Mirror of https://github.com/mjl-/mox.git, synced 2024-12-25 16:03:48 +03:00
add a webapi and webhooks for a simple http/json-based api

for applications to compose/send messages, receive delivery feedback, and
maintain suppression lists.

this is an alternative to applications using a library to compose messages,
submitting those messages using smtp, and monitoring a mailbox with imap for
DSNs, which can be processed into the equivalent of suppression lists. but you
need to know about all these standards/protocols and find libraries. by using
the webapi & webhooks, you just need an http & json library.

unfortunately, there is no standard for these kinds of api, so mox has made up
yet another one...

matching incoming DSNs about deliveries to original outgoing messages requires
keeping history of "retired" messages (delivered from the queue, either
successfully or failed). this can be enabled per account. history is also
useful for debugging deliveries. we now also keep history of each delivery
attempt, accessible while still in the queue, and kept when a message is
retired. the queue webadmin pages now also have pagination, to show potentially
large history.

a queue of webhook calls is now managed too. failures are retried similar to
message deliveries. webhooks can also be saved to the retired list after
completing. also configurable per account.

messages can be sent with a "unique smtp mail from" address. this can only be
used if the domain is configured with a localpart catchall separator such as
"+". when enabled, a queued message gets assigned a random "fromid", which is
added after the separator when sending. when DSNs are returned, they can be
related to previously sent messages based on this fromid. in the future, we can
implement matching on the "envid" used in the smtp dsn extension, or on the
"message-id" of the message. using a fromid can be triggered by authenticating
with a login email address that is configured as enabling fromid.

suppression lists are automatically managed per account. if a delivery attempt
results in certain smtp errors, the destination address is added to the
suppression list. future messages queued for that recipient will immediately
fail without a delivery attempt. suppression lists protect your mail server
reputation.

submitted messages can carry "extra" data through the queue and webhooks for
outgoing deliveries: through the webapi as a json object, through smtp
submission as message headers of the form "x-mox-extra-<key>: value".

to make it easy to test webapi/webhooks locally, the "localserve" mode actually
puts messages in the queue. when it's time to deliver, it still won't do a full
delivery attempt, but just delivers to the sender account, unless the recipient
address has a special form, simulating a failure to deliver.

admins now have more control over the queue. "hold rules" can be added to mark
newly queued messages as "on hold", pausing delivery. rules can be about
certain sender or recipient domains/addresses, or apply to all messages,
pausing the entire queue. also useful for (local) testing.

new config options have been introduced. they are editable through the admin
and/or account web interfaces.

the webapi http endpoints are enabled for newly generated configs with the
quickstart, and in localserve. existing configurations must explicitly enable
the webapi in mox.conf.

gopherwatch.org was created to dogfood this code. it initially used just the
compose/smtpclient/imapclient mox packages to send messages and process
delivery feedback. it will get a config option to use the mox webapi/webhooks
instead. the gopherwatch code to use the webapi/webhooks is smaller and
simpler, and developing it shaped the development of the mox webapi/webhooks.

for issue #31 by cuu508
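For illustration of the "extra" data mechanism described above, here is a minimal Go sketch (not part of this commit) that submits a message over SMTP with an "X-Mox-Extra-<key>" header; the hostname, credentials and addresses are placeholders, and the submission server is assumed to offer STARTTLS on port 587.

```go
package main

import (
	"log"
	"net/smtp"
	"strings"
)

func main() {
	// Placeholder credentials and hosts; adjust for a real account.
	auth := smtp.PlainAuth("", "app@example.com", "password", "mail.example.com")

	msg := strings.Join([]string{
		"From: app@example.com",
		"To: user@example.org",
		"Subject: order confirmation",
		// Extra data carried through the queue and outgoing webhooks,
		// as message headers of the form "x-mox-extra-<key>: value".
		"X-Mox-Extra-OrderID: 12345",
		"",
		"Thanks for your order.",
		"",
	}, "\r\n")

	err := smtp.SendMail("mail.example.com:587", auth, "app@example.com",
		[]string{"user@example.org"}, []byte(msg))
	if err != nil {
		log.Fatalf("submit: %v", err)
	}
}
```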
This commit is contained in:
parent 8bec5ef7d4
commit 09fcc49223

87 changed files with 15556 additions and 1306 deletions
README.md | 22
@@ -33,6 +33,8 @@ See Quickstart below to get started.
   support is limited).
 - Webserver with serving static files and forwarding requests (reverse
   proxy), so port 443 can also be used to serve websites.
+- Simple HTTP/JSON API for sending transactional email and receiving delivery
+  events and incoming messages (webapi and webhooks).
 - Prometheus metrics and structured logging for operational insight.
 - "mox localserve" subcommand for running mox locally for email-related
   testing/developing, including pedantic mode.
@@ -133,12 +135,13 @@ https://nlnet.nl/project/Mox/.
 
 ## Roadmap
 
+- Aliases, for delivering to multiple local accounts.
 - Webmail improvements
-- HTTP-based API for sending messages and receiving delivery feedback
 - Calendaring with CalDAV/iCal
 - More IMAP extensions (PREVIEW, WITHIN, IMPORTANT, COMPRESS=DEFLATE,
   CREATE-SPECIAL-USE, SAVEDATE, UNAUTHENTICATE, REPLACE, QUOTA, NOTIFY,
   MULTIAPPEND, OBJECTID, MULTISEARCH, THREAD, SORT)
+- SMTP DSN extension
 - ARC, with forwarded email from trusted source
 - Forwarding (to an external address)
 - Add special IMAP mailbox ("Queue?") that contains queued but
@@ -447,6 +450,23 @@ messages, for example by replacing your Message-Id header and thereby
 invalidating your DKIM-signatures, or rejecting messages with more than one
 DKIM-signature.
 
+## Can I use mox to send transactional email?
+
+Yes. While you can use SMTP submission to send messages you've composed
+yourself, and monitor a mailbox for DSNs, a more convenient option is to use
+the mox HTTP/JSON-based webapi and webhooks.
+
+The mox webapi can be used to send outgoing messages that mox composes. The web
+api can also be used to deal with messages stored in an account, like changing
+message flags, retrieving messages in parsed form or individual parts of
+multipart messages, or moving messages to another mailbox or deleting messages
+altogether.
+
+Mox webhooks can be used to receive updates about incoming and outgoing
+deliveries. Mox can automatically manage per account suppression lists.
+
+See https://www.xmox.nl/features/#hdr-webapi-and-webhooks for details.
+
 ## Can I use existing TLS certificates/keys?
 
 Yes. The quickstart command creates a config that uses ACME with Let's Encrypt,
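For illustration, a minimal Go sketch of calling the webapi over HTTP/JSON. The method name "Send", the /webapi/v0/ path, the basic-auth credentials and the form-encoded "request" field are assumptions made for this example, not taken from this commit; see the features page linked above for the actual interface.

```go
package main

import (
	"fmt"
	"net/http"
	"net/url"
	"strings"
)

func main() {
	// Assumed request shape; field names are illustrative only.
	reqJSON := `{"To": [{"Address": "user@example.org"}], "Subject": "hi", "Text": "hello from the webapi"}`

	form := url.Values{"request": {reqJSON}}
	req, err := http.NewRequest("POST", "https://mail.example.com/webapi/v0/Send",
		strings.NewReader(form.Encode()))
	if err != nil {
		panic(err)
	}
	req.SetBasicAuth("app@example.com", "password")
	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```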
@@ -13,6 +13,7 @@ Below are the incompatible changes between v0.0.10 and next, per package.
 # iprev
 
 # message
+- (*Composer).TextPart: changed from func(string) ([]byte, string, string) to func(string, string) ([]byte, string, string)
 - From: changed from func(*log/slog.Logger, bool, io.ReaderAt) (github.com/mjl-/mox/smtp.Address, *Envelope, net/textproto.MIMEHeader, error) to func(*log/slog.Logger, bool, io.ReaderAt, *Part) (github.com/mjl-/mox/smtp.Address, *Envelope, net/textproto.MIMEHeader, error)
 - NewComposer: changed from func(io.Writer, int64) *Composer to func(io.Writer, int64, bool) *Composer
 
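A short sketch of the changed message-package signatures listed above: NewComposer gained a third argument (assumed here to be an smtputf8 flag) and TextPart now takes the text subtype first. The Header/Line calls mirror usage elsewhere in this commit; writing the returned part body is omitted.

```go
package main

import (
	"bytes"
	"fmt"

	"github.com/mjl-/mox/message"
)

func composeText() []byte {
	var buf bytes.Buffer
	// Third argument assumed to be smtputf8.
	xc := message.NewComposer(&buf, 100*1024, false)
	xc.Header("MIME-Version", "1.0")

	// TextPart now takes the subtype, e.g. "plain" or "html".
	textBody, ct, cte := xc.TextPart("plain", "hello")
	xc.Header("Content-Type", ct)
	xc.Header("Content-Transfer-Encoding", cte)
	xc.Line()
	_ = textBody // Writing the body is left out of this sketch.
	return buf.Bytes()
}

func main() {
	fmt.Printf("%d header bytes composed\n", len(composeText()))
}
```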
@@ -16,3 +16,5 @@ spf
 subjectpass
 tlsrpt
 updates
+webapi
+webhook
@@ -182,6 +182,8 @@ type Listener struct {
 	AdminHTTPS WebService `sconf:"optional" sconf-doc:"Admin web interface listener like AdminHTTP, but for HTTPS. Requires a TLS config."`
 	WebmailHTTP WebService `sconf:"optional" sconf-doc:"Webmail client, for reading email. Default path is /webmail/."`
 	WebmailHTTPS WebService `sconf:"optional" sconf-doc:"Webmail client, like WebmailHTTP, but for HTTPS. Requires a TLS config."`
+	WebAPIHTTP WebService `sconf:"optional" sconf-doc:"Like WebAPIHTTPS, but with plain HTTP, without TLS."`
+	WebAPIHTTPS WebService `sconf:"optional" sconf-doc:"WebAPI, a simple HTTP/JSON-based API for email, with HTTPS (requires a TLS config). Default path is /webapi/."`
 	MetricsHTTP struct {
 		Enabled bool
 		Port int `sconf:"optional" sconf-doc:"Default 8010."`
@@ -210,7 +212,7 @@ type Listener struct {
 	} `sconf:"optional" sconf-doc:"All configured WebHandlers will serve on an enabled listener. Either ACME must be configured, or for each WebHandler domain a TLS certificate must be configured."`
 }
 
-// WebService is an internal web interface: webmail, account, admin.
+// WebService is an internal web interface: webmail, webaccount, webadmin, webapi.
 type WebService struct {
 	Enabled bool
 	Port int `sconf:"optional" sconf-doc:"Default 80 for HTTP and 443 for HTTPS."`
@@ -356,6 +358,19 @@ type Route struct {
 
 // todo: move RejectsMailbox to store.Mailbox.SpecialUse, possibly with "X" prefix?
 
+// note: outgoing hook events are in ../queue/hooks.go, ../mox-/config.go, ../queue.go and ../webapi/gendoc.sh. keep in sync.
+
+type OutgoingWebhook struct {
+	URL string `sconf-doc:"URL to POST webhooks."`
+	Authorization string `sconf:"optional" sconf-doc:"If not empty, value of Authorization header to add to HTTP requests."`
+	Events []string `sconf:"optional" sconf-doc:"Events to send outgoing delivery notifications for. If absent, all events are sent. Valid values: delivered, suppressed, delayed, failed, relayed, expanded, canceled, unrecognized."`
+}
+
+type IncomingWebhook struct {
+	URL string `sconf-doc:"URL to POST webhooks to for incoming deliveries over SMTP."`
+	Authorization string `sconf:"optional" sconf-doc:"If not empty, value of Authorization header to add to HTTP requests."`
+}
+
 type SubjectPass struct {
 	Period time.Duration `sconf-doc:"How long unique values are accepted after generating, e.g. 12h."` // todo: have a reasonable default for this?
 }
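For illustration, a minimal Go sketch of an application endpoint that could receive the webhook POSTs configured above. Only the URL and Authorization behaviour comes from this config; the payload field used here is invented for the example.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// outgoingEvent is a stand-in; this excerpt does not define the real payload.
type outgoingEvent struct {
	Event string `json:"Event"` // e.g. delivered, suppressed, delayed, failed, ...
}

func main() {
	http.HandleFunc("/mox/hooks/outgoing", func(w http.ResponseWriter, r *http.Request) {
		// Value matches the optional Authorization setting in OutgoingWebhook/IncomingWebhook.
		if r.Header.Get("Authorization") != "Bearer sharedsecret" {
			http.Error(w, "unauthorized", http.StatusUnauthorized)
			return
		}
		var ev outgoingEvent
		if err := json.NewDecoder(r.Body).Decode(&ev); err != nil {
			http.Error(w, "bad json", http.StatusBadRequest)
			return
		}
		log.Printf("outgoing delivery event: %s", ev.Event)
		w.WriteHeader(http.StatusOK)
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:8081", nil))
}
```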
@@ -368,6 +383,12 @@ type AutomaticJunkFlags struct {
 }
 
 type Account struct {
+	OutgoingWebhook *OutgoingWebhook `sconf:"optional" sconf-doc:"Webhooks for events about outgoing deliveries."`
+	IncomingWebhook *IncomingWebhook `sconf:"optional" sconf-doc:"Webhooks for events about incoming deliveries over SMTP."`
+	FromIDLoginAddresses []string `sconf:"optional" sconf-doc:"Login addresses that cause outgoing email to be sent with SMTP MAIL FROM addresses with a unique id after the localpart catchall separator (which must be enabled when addresses are specified here). Any delivery status notifications (DSN, e.g. for bounces) can be related to the original message and recipient with unique id's. You can login to an account with any valid email address, including variants with the localpart catchall separator. You can use this mechanism to send outgoing messages both with and without unique fromid for a given address."`
+	KeepRetiredMessagePeriod time.Duration `sconf:"optional" sconf-doc:"Period to keep messages retired from the queue (delivered or failed) around. Keeping retired messages is useful for maintaining the suppression list for transactional email, for matching incoming DSNs to sent messages, and for debugging. The time at which to clean up (remove) is calculated at retire time. E.g. 168h (1 week)."`
+	KeepRetiredWebhookPeriod time.Duration `sconf:"optional" sconf-doc:"Period to keep webhooks retired from the queue (delivered or failed) around. Useful for debugging. The time at which to clean up (remove) is calculated at retire time. E.g. 168h (1 week)."`
+
 	Domain string `sconf-doc:"Default domain for account. Deprecated behaviour: If a destination is not a full address but only a localpart, this domain is added to form a full address."`
 	Description string `sconf:"optional" sconf-doc:"Free form description, e.g. full name or alternative contact info."`
 	FullName string `sconf:"optional" sconf-doc:"Full name, to use in message From header when composing messages in webmail. Can be overridden per destination."`
@@ -383,10 +404,11 @@ type Account struct {
 	NoFirstTimeSenderDelay bool `sconf:"optional" sconf-doc:"Do not apply a delay to SMTP connections before accepting an incoming message from a first-time sender. Can be useful for accounts that sends automated responses and want instant replies."`
 	Routes []Route `sconf:"optional" sconf-doc:"Routes for delivering outgoing messages through the queue. Each delivery attempt evaluates these account routes, domain routes and finally global routes. The transport of the first matching route is used in the delivery attempt. If no routes match, which is the default with no configured routes, messages are delivered directly from the queue."`
 
 	DNSDomain dns.Domain `sconf:"-"` // Parsed form of Domain.
 	JunkMailbox *regexp.Regexp `sconf:"-" json:"-"`
 	NeutralMailbox *regexp.Regexp `sconf:"-" json:"-"`
 	NotJunkMailbox *regexp.Regexp `sconf:"-" json:"-"`
+	ParsedFromIDLoginAddresses []smtp.Address `sconf:"-" json:"-"`
 }
 
 type JunkFilter struct {
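For illustration, a hypothetical helper (not part of mox) showing how a returned DSN recipient such as "app+1a2b3c@example.com" can be split into the base address and the fromid when the localpart catchall separator is "+". Real localpart parsing is more involved; this only sketches the matching idea.

```go
package main

import (
	"fmt"
	"strings"
)

// splitFromID splits an address into its base form and the fromid that follows
// the catchall separator, if any.
func splitFromID(address, separator string) (baseAddress, fromID string) {
	localpart, domain, ok := strings.Cut(address, "@")
	if !ok {
		return address, ""
	}
	baseLocal, id, _ := strings.Cut(localpart, separator)
	return baseLocal + "@" + domain, id
}

func main() {
	base, fromID := splitFromID("app+1a2b3c@example.com", "+")
	fmt.Println(base, fromID) // app@example.com 1a2b3c
}
```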
@@ -386,6 +386,35 @@ See https://pkg.go.dev/github.com/mjl-/sconf for details.
 			# limiting and for the "secure" status of cookies. (optional)
 			Forwarded: false
 
+		# Like WebAPIHTTPS, but with plain HTTP, without TLS. (optional)
+		WebAPIHTTP:
+			Enabled: false
+
+			# Default 80 for HTTP and 443 for HTTPS. (optional)
+			Port: 0
+
+			# Path to serve requests on. (optional)
+			Path:
+
+			# If set, X-Forwarded-* headers are used for the remote IP address for rate
+			# limiting and for the "secure" status of cookies. (optional)
+			Forwarded: false
+
+		# WebAPI, a simple HTTP/JSON-based API for email, with HTTPS (requires a TLS
+		# config). Default path is /webapi/. (optional)
+		WebAPIHTTPS:
+			Enabled: false
+
+			# Default 80 for HTTP and 443 for HTTPS. (optional)
+			Port: 0
+
+			# Path to serve requests on. (optional)
+			Path:
+
+			# If set, X-Forwarded-* headers are used for the remote IP address for rate
+			# limiting and for the "secure" status of cookies. (optional)
+			Forwarded: false
+
 		# Serve prometheus metrics, for monitoring. You should not enable this on a public
 		# IP. (optional)
 		MetricsHTTP:
@@ -855,6 +884,53 @@ See https://pkg.go.dev/github.com/mjl-/sconf for details.
 Accounts:
 	x:
 
+		# Webhooks for events about outgoing deliveries. (optional)
+		OutgoingWebhook:
+
+			# URL to POST webhooks.
+			URL:
+
+			# If not empty, value of Authorization header to add to HTTP requests. (optional)
+			Authorization:
+
+			# Events to send outgoing delivery notifications for. If absent, all events are
+			# sent. Valid values: delivered, suppressed, delayed, failed, relayed, expanded,
+			# canceled, unrecognized. (optional)
+			Events:
+				-
+
+		# Webhooks for events about incoming deliveries over SMTP. (optional)
+		IncomingWebhook:
+
+			# URL to POST webhooks to for incoming deliveries over SMTP.
+			URL:
+
+			# If not empty, value of Authorization header to add to HTTP requests. (optional)
+			Authorization:
+
+		# Login addresses that cause outgoing email to be sent with SMTP MAIL FROM
+		# addresses with a unique id after the localpart catchall separator (which must be
+		# enabled when addresses are specified here). Any delivery status notifications
+		# (DSN, e.g. for bounces) can be related to the original message and recipient
+		# with unique id's. You can login to an account with any valid email address,
+		# including variants with the localpart catchall separator. You can use this
+		# mechanism to send outgoing messages both with and without unique fromid for
+		# a given address. (optional)
+		FromIDLoginAddresses:
+			-
+
+		# Period to keep messages retired from the queue (delivered or failed) around.
+		# Keeping retired messages is useful for maintaining the suppression list for
+		# transactional email, for matching incoming DSNs to sent messages, and for
+		# debugging. The time at which to clean up (remove) is calculated at retire time.
+		# E.g. 168h (1 week). (optional)
+		KeepRetiredMessagePeriod: 0s
+
+		# Period to keep webhooks retired from the queue (delivered or failed) around.
+		# Useful for debugging. The time at which to clean up (remove) is calculated at
+		# retire time. E.g. 168h (1 week). (optional)
+		KeepRetiredWebhookPeriod: 0s
+
 		# Default domain for account. Deprecated behaviour: If a destination is not a full
 		# address but only a localpart, this domain is added to form a full address.
 		Domain:
@@ -1233,8 +1309,8 @@ See https://pkg.go.dev/github.com/mjl-/sconf for details.
 # Examples
 
 Mox includes configuration files to illustrate common setups. You can see these
-examples with "mox example", and print a specific example with "mox example
-<name>". Below are all examples included in mox.
+examples with "mox config example", and print a specific example with "mox
+config example <name>". Below are all examples included in mox.
 
 # Example webhandlers
 
ctl.go | 367
@@ -4,6 +4,7 @@ import (
 	"bufio"
 	"context"
 	"encoding/json"
+	"errors"
 	"fmt"
 	"io"
 	"log"
@@ -27,6 +28,7 @@ import (
 	"github.com/mjl-/mox/queue"
 	"github.com/mjl-/mox/smtp"
 	"github.com/mjl-/mox/store"
+	"github.com/mjl-/mox/webapi"
 )
 
 // ctl represents a connection to the ctl unix domain socket of a running mox instance.
@@ -294,12 +296,11 @@ func servectl(ctx context.Context, log mlog.Log, conn net.Conn, shutdown func())
 	}
 }
 
-func xparseFilters(ctl *ctl, s string) (f queue.Filter) {
+func xparseJSON(ctl *ctl, s string, v any) {
 	dec := json.NewDecoder(strings.NewReader(s))
 	dec.DisallowUnknownFields()
-	err := dec.Decode(&f)
-	ctl.xcheck(err, "parsing filters")
-	return f
+	err := dec.Decode(v)
+	ctl.xcheck(err, "parsing from ctl as json")
 }
 
 func servectlcmd(ctx context.Context, ctl *ctl, shutdown func()) {
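A small standalone sketch of the strict decoding behaviour the generic xparseJSON helper relies on: with DisallowUnknownFields, a misspelled filter field becomes an error instead of being silently dropped. The struct below is a stand-in, not queue.Filter.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

type exampleFilter struct {
	Account string
	Hold    *bool
}

func main() {
	dec := json.NewDecoder(strings.NewReader(`{"Acount": "mjl"}`)) // misspelled on purpose
	dec.DisallowUnknownFields()
	var f exampleFilter
	if err := dec.Decode(&f); err != nil {
		fmt.Println("rejected:", err) // json: unknown field "Acount"
	}
}
```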
@@ -447,14 +448,17 @@ func servectlcmd(ctx context.Context, ctl *ctl, shutdown func()) {
 
 	case "queuelist":
 		/* protocol:
-		> "queue"
-		> queuefilters as json
+		> "queuelist"
+		> filters as json
+		> sort as json
 		< "ok"
 		< stream
 		*/
-		fs := ctl.xread()
-		f := xparseFilters(ctl, fs)
-		qmsgs, err := queue.List(ctx, f)
+		var f queue.Filter
+		xparseJSON(ctl, ctl.xread(), &f)
+		var s queue.Sort
+		xparseJSON(ctl, ctl.xread(), &s)
+		qmsgs, err := queue.List(ctx, f, s)
 		ctl.xcheck(err, "listing queue")
 		ctl.xwriteok()
 
@@ -465,7 +469,7 @@ func servectlcmd(ctx context.Context, ctl *ctl, shutdown func()) {
 			if qm.LastAttempt != nil {
 				lastAttempt = time.Since(*qm.LastAttempt).Round(time.Second).String()
 			}
-			fmt.Fprintf(xw, "%5d %s from:%s to:%s next %s last %s error %q\n", qm.ID, qm.Queued.Format(time.RFC3339), qm.Sender().LogString(), qm.Recipient().LogString(), -time.Since(qm.NextAttempt).Round(time.Second), lastAttempt, qm.LastError)
+			fmt.Fprintf(xw, "%5d %s from:%s to:%s next %s last %s error %q\n", qm.ID, qm.Queued.Format(time.RFC3339), qm.Sender().LogString(), qm.Recipient().LogString(), -time.Since(qm.NextAttempt).Round(time.Second), lastAttempt, qm.LastResult().Error)
 		}
 		if len(qmsgs) == 0 {
 			fmt.Fprint(xw, "(none)\n")
@@ -481,8 +485,8 @@ func servectlcmd(ctx context.Context, ctl *ctl, shutdown func()) {
 		< count
 		*/
 
-		fs := ctl.xread()
-		f := xparseFilters(ctl, fs)
+		var f queue.Filter
+		xparseJSON(ctl, ctl.xread(), &f)
 		hold := ctl.xread() == "true"
 		count, err := queue.HoldSet(ctx, f, hold)
 		ctl.xcheck(err, "setting on hold status for messages")
@@ -499,8 +503,8 @@ func servectlcmd(ctx context.Context, ctl *ctl, shutdown func()) {
 		< count
 		*/
 
-		fs := ctl.xread()
-		f := xparseFilters(ctl, fs)
+		var f queue.Filter
+		xparseJSON(ctl, ctl.xread(), &f)
 		relnow := ctl.xread()
 		d, err := time.ParseDuration(ctl.xread())
 		ctl.xcheck(err, "parsing duration for next delivery attempt")
@@ -523,8 +527,8 @@ func servectlcmd(ctx context.Context, ctl *ctl, shutdown func()) {
 		< count
 		*/
 
-		fs := ctl.xread()
-		f := xparseFilters(ctl, fs)
+		var f queue.Filter
+		xparseJSON(ctl, ctl.xread(), &f)
 		transport := ctl.xread()
 		count, err := queue.TransportSet(ctx, f, transport)
 		ctl.xcheck(err, "adding to next delivery attempts in queue")
@@ -540,8 +544,8 @@ func servectlcmd(ctx context.Context, ctl *ctl, shutdown func()) {
 		< count
 		*/
 
-		fs := ctl.xread()
-		f := xparseFilters(ctl, fs)
+		var f queue.Filter
+		xparseJSON(ctl, ctl.xread(), &f)
 		reqtls := ctl.xread()
 		var req *bool
 		switch reqtls {
@@ -568,8 +572,8 @@ func servectlcmd(ctx context.Context, ctl *ctl, shutdown func()) {
 		< count
 		*/
 
-		fs := ctl.xread()
-		f := xparseFilters(ctl, fs)
+		var f queue.Filter
+		xparseJSON(ctl, ctl.xread(), &f)
 		count, err := queue.Fail(ctx, log, f)
 		ctl.xcheck(err, "marking messages from queue as failed")
 		ctl.xwriteok()
@@ -583,8 +587,8 @@ func servectlcmd(ctx context.Context, ctl *ctl, shutdown func()) {
 		< count
 		*/
 
-		fs := ctl.xread()
-		f := xparseFilters(ctl, fs)
+		var f queue.Filter
+		xparseJSON(ctl, ctl.xread(), &f)
 		count, err := queue.Drop(ctx, log, f)
 		ctl.xcheck(err, "dropping messages from queue")
 		ctl.xwriteok()
@@ -612,6 +616,325 @@ func servectlcmd(ctx context.Context, ctl *ctl, shutdown func()) {
 		ctl.xwriteok()
 		ctl.xstreamfrom(mr)
 
+	case "queueretiredlist":
+		/* protocol:
+		> "queueretiredlist"
+		> filters as json
+		> sort as json
+		< "ok"
+		< stream
+		*/
+		var f queue.RetiredFilter
+		xparseJSON(ctl, ctl.xread(), &f)
+		var s queue.RetiredSort
+		xparseJSON(ctl, ctl.xread(), &s)
+		qmsgs, err := queue.RetiredList(ctx, f, s)
+		ctl.xcheck(err, "listing retired queue")
+		ctl.xwriteok()
+
+		xw := ctl.writer()
+		fmt.Fprintln(xw, "retired messages:")
+		for _, qm := range qmsgs {
+			var lastAttempt string
+			if qm.LastAttempt != nil {
+				lastAttempt = time.Since(*qm.LastAttempt).Round(time.Second).String()
+			}
+			result := "failure"
+			if qm.Success {
+				result = "success"
+			}
+			sender, err := qm.Sender()
+			xcheckf(err, "parsing sender")
+			fmt.Fprintf(xw, "%5d %s %s from:%s to:%s last %s error %q\n", qm.ID, qm.Queued.Format(time.RFC3339), result, sender.LogString(), qm.Recipient().LogString(), lastAttempt, qm.LastResult().Error)
+		}
+		if len(qmsgs) == 0 {
+			fmt.Fprint(xw, "(none)\n")
+		}
+		xw.xclose()
+
+	case "queueretiredprint":
+		/* protocol:
+		> "queueretiredprint"
+		> id
+		< "ok"
+		< stream
+		*/
+		idstr := ctl.xread()
+		id, err := strconv.ParseInt(idstr, 10, 64)
+		if err != nil {
+			ctl.xcheck(err, "parsing id")
+		}
+		l, err := queue.RetiredList(ctx, queue.RetiredFilter{IDs: []int64{id}}, queue.RetiredSort{})
+		ctl.xcheck(err, "getting retired messages")
+		if len(l) == 0 {
+			ctl.xcheck(errors.New("not found"), "getting retired message")
+		}
+		m := l[0]
+		ctl.xwriteok()
+		xw := ctl.writer()
+		enc := json.NewEncoder(xw)
+		enc.SetIndent("", "\t")
+		err = enc.Encode(m)
+		ctl.xcheck(err, "encode retired message")
+		xw.xclose()
+
+	case "queuehooklist":
+		/* protocol:
+		> "queuehooklist"
+		> filters as json
+		> sort as json
+		< "ok"
+		< stream
+		*/
+		var f queue.HookFilter
+		xparseJSON(ctl, ctl.xread(), &f)
+		var s queue.HookSort
+		xparseJSON(ctl, ctl.xread(), &s)
+		hooks, err := queue.HookList(ctx, f, s)
+		ctl.xcheck(err, "listing webhooks")
+		ctl.xwriteok()
+
+		xw := ctl.writer()
+		fmt.Fprintln(xw, "webhooks:")
+		for _, h := range hooks {
+			var lastAttempt string
+			if len(h.Results) > 0 {
+				lastAttempt = time.Since(h.LastResult().Start).Round(time.Second).String()
+			}
+			fmt.Fprintf(xw, "%5d %s account:%s next %s last %s error %q url %s\n", h.ID, h.Submitted.Format(time.RFC3339), h.Account, time.Until(h.NextAttempt).Round(time.Second), lastAttempt, h.LastResult().Error, h.URL)
+		}
+		if len(hooks) == 0 {
+			fmt.Fprint(xw, "(none)\n")
+		}
+		xw.xclose()
+
+	case "queuehookschedule":
+		/* protocol:
+		> "queuehookschedule"
+		> hookfilters as json
+		> relative to now
+		> duration
+		< "ok" or error
+		< count
+		*/
+
+		var f queue.HookFilter
+		xparseJSON(ctl, ctl.xread(), &f)
+		relnow := ctl.xread()
+		d, err := time.ParseDuration(ctl.xread())
+		ctl.xcheck(err, "parsing duration for next delivery attempt")
+		var count int
+		if relnow == "" {
+			count, err = queue.HookNextAttemptAdd(ctx, f, d)
+		} else {
+			count, err = queue.HookNextAttemptSet(ctx, f, time.Now().Add(d))
+		}
+		ctl.xcheck(err, "setting next delivery attempts in queue")
+		ctl.xwriteok()
+		ctl.xwrite(fmt.Sprintf("%d", count))
+
+	case "queuehookcancel":
+		/* protocol:
+		> "queuehookcancel"
+		> hookfilters as json
+		< "ok" or error
+		< count
+		*/
+
+		var f queue.HookFilter
+		xparseJSON(ctl, ctl.xread(), &f)
+		count, err := queue.HookCancel(ctx, log, f)
+		ctl.xcheck(err, "canceling webhooks in queue")
+		ctl.xwriteok()
+		ctl.xwrite(fmt.Sprintf("%d", count))
+
+	case "queuehookprint":
+		/* protocol:
+		> "queuehookprint"
+		> id
+		< "ok"
+		< stream
+		*/
+		idstr := ctl.xread()
+		id, err := strconv.ParseInt(idstr, 10, 64)
+		if err != nil {
+			ctl.xcheck(err, "parsing id")
+		}
+		l, err := queue.HookList(ctx, queue.HookFilter{IDs: []int64{id}}, queue.HookSort{})
+		ctl.xcheck(err, "getting webhooks")
+		if len(l) == 0 {
+			ctl.xcheck(errors.New("not found"), "getting webhook")
+		}
+		h := l[0]
+		ctl.xwriteok()
+		xw := ctl.writer()
+		enc := json.NewEncoder(xw)
+		enc.SetIndent("", "\t")
+		err = enc.Encode(h)
+		ctl.xcheck(err, "encode webhook")
+		xw.xclose()
+
+	case "queuehookretiredlist":
+		/* protocol:
+		> "queuehookretiredlist"
+		> filters as json
+		> sort as json
+		< "ok"
+		< stream
+		*/
+		var f queue.HookRetiredFilter
+		xparseJSON(ctl, ctl.xread(), &f)
+		var s queue.HookRetiredSort
+		xparseJSON(ctl, ctl.xread(), &s)
+		l, err := queue.HookRetiredList(ctx, f, s)
+		ctl.xcheck(err, "listing retired webhooks")
+		ctl.xwriteok()
+
+		xw := ctl.writer()
+		fmt.Fprintln(xw, "retired webhooks:")
+		for _, h := range l {
+			var lastAttempt string
+			if len(h.Results) > 0 {
+				lastAttempt = time.Since(h.LastResult().Start).Round(time.Second).String()
+			}
+			result := "success"
+			if !h.Success {
+				result = "failure"
+			}
+			fmt.Fprintf(xw, "%5d %s %s account:%s last %s error %q url %s\n", h.ID, h.Submitted.Format(time.RFC3339), result, h.Account, lastAttempt, h.LastResult().Error, h.URL)
+		}
+		if len(l) == 0 {
+			fmt.Fprint(xw, "(none)\n")
+		}
+		xw.xclose()
+
+	case "queuehookretiredprint":
+		/* protocol:
+		> "queuehookretiredprint"
+		> id
+		< "ok"
+		< stream
+		*/
+		idstr := ctl.xread()
+		id, err := strconv.ParseInt(idstr, 10, 64)
+		if err != nil {
+			ctl.xcheck(err, "parsing id")
+		}
+		l, err := queue.HookRetiredList(ctx, queue.HookRetiredFilter{IDs: []int64{id}}, queue.HookRetiredSort{})
+		ctl.xcheck(err, "getting retired webhooks")
+		if len(l) == 0 {
+			ctl.xcheck(errors.New("not found"), "getting retired webhook")
+		}
+		h := l[0]
+		ctl.xwriteok()
+		xw := ctl.writer()
+		enc := json.NewEncoder(xw)
+		enc.SetIndent("", "\t")
+		err = enc.Encode(h)
+		ctl.xcheck(err, "encode retired webhook")
+		xw.xclose()
+
+	case "queuesuppresslist":
+		/* protocol:
+		> "queuesuppresslist"
+		> account (or empty)
+		< "ok" or error
+		< stream
+		*/
+
+		account := ctl.xread()
+		l, err := queue.SuppressionList(ctx, account)
+		ctl.xcheck(err, "listing suppressions")
+		ctl.xwriteok()
+		xw := ctl.writer()
+		fmt.Fprintln(xw, "suppressions (account, address, manual, time added, base adddress, reason):")
+		for _, sup := range l {
+			manual := "No"
+			if sup.Manual {
+				manual = "Yes"
+			}
+			fmt.Fprintf(xw, "%q\t%q\t%s\t%s\t%q\t%q\n", sup.Account, sup.OriginalAddress, manual, sup.Created.Round(time.Second), sup.BaseAddress, sup.Reason)
+		}
+		if len(l) == 0 {
+			fmt.Fprintln(xw, "(none)")
+		}
+		xw.xclose()
+
+	case "queuesuppressadd":
+		/* protocol:
+		> "queuesuppressadd"
+		> account
+		> address
+		< "ok" or error
+		*/
+
+		account := ctl.xread()
+		address := ctl.xread()
+		_, ok := mox.Conf.Account(account)
+		if !ok {
+			ctl.xcheck(errors.New("unknown account"), "looking up account")
+		}
+		addr, err := smtp.ParseAddress(address)
+		ctl.xcheck(err, "parsing address")
+		sup := webapi.Suppression{
+			Account: account,
+			Manual:  true,
+			Reason:  "added through mox cli",
+		}
+		err = queue.SuppressionAdd(ctx, addr.Path(), &sup)
+		ctl.xcheck(err, "adding suppression")
+		ctl.xwriteok()
+
+	case "queuesuppressremove":
+		/* protocol:
+		> "queuesuppressremove"
+		> account
+		> address
+		< "ok" or error
+		*/
+
+		account := ctl.xread()
+		address := ctl.xread()
+		addr, err := smtp.ParseAddress(address)
+		ctl.xcheck(err, "parsing address")
+		err = queue.SuppressionRemove(ctx, account, addr.Path())
+		ctl.xcheck(err, "removing suppression")
+		ctl.xwriteok()
+
+	case "queuesuppresslookup":
+		/* protocol:
+		> "queuesuppresslookup"
+		> account or empty
+		> address
+		< "ok" or error
+		< stream
+		*/
+
+		account := ctl.xread()
+		address := ctl.xread()
+		if account != "" {
+			_, ok := mox.Conf.Account(account)
+			if !ok {
+				ctl.xcheck(errors.New("unknown account"), "looking up account")
+			}
+		}
+		addr, err := smtp.ParseAddress(address)
+		ctl.xcheck(err, "parsing address")
+		sup, err := queue.SuppressionLookup(ctx, account, addr.Path())
+		ctl.xcheck(err, "looking up suppression")
+		ctl.xwriteok()
+		xw := ctl.writer()
+		if sup == nil {
+			fmt.Fprintln(xw, "not present")
+		} else {
+			manual := "no"
+			if sup.Manual {
+				manual = "yes"
+			}
+			fmt.Fprintf(xw, "present\nadded: %s\nmanual: %s\nbase address: %s\nreason: %q\n", sup.Created.Round(time.Second), manual, sup.BaseAddress, sup.Reason)
+		}
+		xw.xclose()
+
 	case "importmaildir", "importmbox":
 		mbox := cmd == "importmbox"
 		importctl(ctx, ctl, mbox)
ctl_test.go | 121
@@ -5,6 +5,7 @@ package main
 import (
 	"context"
 	"flag"
+	"fmt"
 	"net"
 	"os"
 	"path/filepath"
@@ -17,6 +18,7 @@ import (
 	"github.com/mjl-/mox/mox-"
 	"github.com/mjl-/mox/mtastsdb"
 	"github.com/mjl-/mox/queue"
+	"github.com/mjl-/mox/smtp"
 	"github.com/mjl-/mox/store"
 	"github.com/mjl-/mox/tlsrptdb"
 )
@@ -43,6 +45,9 @@ func TestCtl(t *testing.T) {
 	}
 	defer store.Switchboard()()
 
+	err := queue.Init()
+	tcheck(t, err, "queue init")
+
 	testctl := func(fn func(clientctl *ctl)) {
 		t.Helper()
 
@@ -65,9 +70,6 @@ func TestCtl(t *testing.T) {
 		ctlcmdSetaccountpassword(ctl, "mjl", "test4321")
 	})
 
-	err := queue.Init()
-	tcheck(t, err, "queue init")
-
 	testctl(func(ctl *ctl) {
 		ctlcmdQueueHoldrulesList(ctl)
 	})
@@ -90,6 +92,22 @@ func TestCtl(t *testing.T) {
 		ctlcmdQueueHoldrulesRemove(ctl, 1)
 	})
 
+	// Queue a message to list/change/dump.
+	msg := "Subject: subject\r\n\r\nbody\r\n"
+	msgFile, err := store.CreateMessageTemp(pkglog, "queuedump-test")
+	tcheck(t, err, "temp file")
+	_, err = msgFile.Write([]byte(msg))
+	tcheck(t, err, "write message")
+	_, err = msgFile.Seek(0, 0)
+	tcheck(t, err, "rewind message")
+	defer os.Remove(msgFile.Name())
+	defer msgFile.Close()
+	addr, err := smtp.ParseAddress("mjl@mox.example")
+	tcheck(t, err, "parse address")
+	qml := []queue.Msg{queue.MakeMsg(addr.Path(), addr.Path(), false, false, int64(len(msg)), "<random@localhost>", nil, nil, time.Now(), "subject")}
+	queue.Add(ctxbg, pkglog, "mjl", msgFile, qml...)
+	qmid := qml[0].ID
+
 	// Has entries now.
 	testctl(func(ctl *ctl) {
 		ctlcmdQueueHoldrulesList(ctl)
@@ -97,13 +115,16 @@ func TestCtl(t *testing.T) {
 
 	// "queuelist"
 	testctl(func(ctl *ctl) {
-		ctlcmdQueueList(ctl, queue.Filter{})
+		ctlcmdQueueList(ctl, queue.Filter{}, queue.Sort{})
 	})
 
 	// "queueholdset"
 	testctl(func(ctl *ctl) {
 		ctlcmdQueueHoldSet(ctl, queue.Filter{}, true)
 	})
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHoldSet(ctl, queue.Filter{}, false)
+	})
 
 	// "queueschedule"
 	testctl(func(ctl *ctl) {
@@ -120,6 +141,11 @@ func TestCtl(t *testing.T) {
 		ctlcmdQueueRequireTLS(ctl, queue.Filter{}, nil)
 	})
 
+	// "queuedump"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueDump(ctl, fmt.Sprintf("%d", qmid))
+	})
+
 	// "queuefail"
 	testctl(func(ctl *ctl) {
 		ctlcmdQueueFail(ctl, queue.Filter{})
@@ -130,7 +156,92 @@ func TestCtl(t *testing.T) {
 		ctlcmdQueueDrop(ctl, queue.Filter{})
 	})
 
-	// no "queuedump", we don't have a message to dump, and the commands exits without a message.
+	// "queueholdruleslist"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHoldrulesList(ctl)
+	})
+
+	// "queueholdrulesadd"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHoldrulesAdd(ctl, "mjl", "", "")
+	})
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHoldrulesAdd(ctl, "mjl", "localhost", "")
+	})
+
+	// "queueholdrulesremove"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHoldrulesRemove(ctl, 2)
+	})
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHoldrulesList(ctl)
+	})
+
+	// "queuesuppresslist"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueSuppressList(ctl, "mjl")
+	})
+
+	// "queuesuppressadd"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueSuppressAdd(ctl, "mjl", "base@localhost")
+	})
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueSuppressAdd(ctl, "mjl", "other@localhost")
+	})
+
+	// "queuesuppresslookup"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueSuppressLookup(ctl, "mjl", "base@localhost")
+	})
+
+	// "queuesuppressremove"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueSuppressRemove(ctl, "mjl", "base@localhost")
+	})
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueSuppressList(ctl, "mjl")
+	})
+
+	// "queueretiredlist"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueRetiredList(ctl, queue.RetiredFilter{}, queue.RetiredSort{})
+	})
+
+	// "queueretiredprint"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueRetiredPrint(ctl, "1")
+	})
+
+	// "queuehooklist"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHookList(ctl, queue.HookFilter{}, queue.HookSort{})
+	})
+
+	// "queuehookschedule"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHookSchedule(ctl, queue.HookFilter{}, true, time.Minute)
+	})
+
+	// "queuehookprint"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHookPrint(ctl, "1")
+	})
+
+	// "queuehookcancel"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHookCancel(ctl, queue.HookFilter{})
+	})
+
+	// "queuehookretiredlist"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHookRetiredList(ctl, queue.HookRetiredFilter{}, queue.HookRetiredSort{})
+	})
+
+	// "queuehookretiredprint"
+	testctl(func(ctl *ctl) {
+		ctlcmdQueueHookRetiredPrint(ctl, "1")
+	})
+
 	// "importmbox"
 	testctl(func(ctl *ctl) {
@@ -307,7 +307,7 @@ done
 - Check code if there are deprecated features that can be removed.
 - Generate apidiff and check if breaking changes can be prevented. Update moxtools.
 - Update features & roadmap in README.md
-- Write release notes.
+- Write release notes, copy from previous.
 - Build and run tests with previous major Go release.
 - Run tests, including with race detector.
 - Run integration and upgrade tests.
@@ -320,7 +320,7 @@ done
 - Check with https://internet.nl.
 - Move apidiff/next.txt to apidiff/<version>.txt, and create empty next.txt.
 - Add release to the Latest release & News sections of website/index.md.
-- Create git tag, push code.
+- Create git tag (note: "#" is comment, not title/header), push code.
 - Publish new docker image.
 - Publish signed release notes for updates.xmox.nl and update DNS record.
 - Deploy update to website.
@@ -842,7 +842,7 @@ Period: %s - %s UTC
 			continue
 		}
 
-		qm := queue.MakeMsg(from.Path(), rcpt.address.Path(), has8bit, smtputf8, msgSize, messageID, []byte(msgPrefix), nil, time.Now())
+		qm := queue.MakeMsg(from.Path(), rcpt.address.Path(), has8bit, smtputf8, msgSize, messageID, []byte(msgPrefix), nil, time.Now(), subject)
 		// Don't try as long as regular deliveries, and stop before we would send the
 		// delayed DSN. Though we also won't send that due to IsDMARCReport.
 		qm.MaxAttempts = 5
@@ -911,7 +911,7 @@ func composeAggregateReport(ctx context.Context, log mlog.Log, mf *os.File, from
 	xc.Line()
 
 	// Textual part, just mentioning this is a DMARC report.
-	textBody, ct, cte := xc.TextPart(text)
+	textBody, ct, cte := xc.TextPart("plain", text)
 	textHdr := textproto.MIMEHeader{}
 	textHdr.Set("Content-Type", ct)
 	textHdr.Set("Content-Transfer-Encoding", cte)
@@ -997,7 +997,7 @@ Submitting-URI: %s
 			continue
 		}
 
-		qm := queue.MakeMsg(fromAddr.Path(), rcpt.Address.Path(), has8bit, smtputf8, msgSize, messageID, []byte(msgPrefix), nil, time.Now())
+		qm := queue.MakeMsg(fromAddr.Path(), rcpt.Address.Path(), has8bit, smtputf8, msgSize, messageID, []byte(msgPrefix), nil, time.Now(), subject)
 		// Don't try as long as regular deliveries, and stop before we would send the
 		// delayed DSN. Though we also won't send that due to IsDMARCReport.
 		qm.MaxAttempts = 5
@@ -1045,7 +1045,7 @@ func composeErrorReport(ctx context.Context, log mlog.Log, mf *os.File, fromAddr
 	xc.Header("User-Agent", "mox/"+moxvar.Version)
 	xc.Header("MIME-Version", "1.0")
 
-	textBody, ct, cte := xc.TextPart(text)
+	textBody, ct, cte := xc.TextPart("plain", text)
 	xc.Header("Content-Type", ct)
 	xc.Header("Content-Transfer-Encoding", cte)
 	xc.Line()
doc.go | 250
@@ -28,15 +28,27 @@ any parameters. Followed by the help and usage information for each command.
 	mox queue holdrules list
 	mox queue holdrules add [ruleflags]
 	mox queue holdrules remove ruleid
-	mox queue list [filterflags]
+	mox queue list [filtersortflags]
 	mox queue hold [filterflags]
 	mox queue unhold [filterflags]
-	mox queue schedule [filterflags] duration
+	mox queue schedule [filterflags] [-now] duration
 	mox queue transport [filterflags] transport
 	mox queue requiretls [filterflags] {yes | no | default}
 	mox queue fail [filterflags]
 	mox queue drop [filterflags]
 	mox queue dump id
+	mox queue retired list [filtersortflags]
+	mox queue retired print id
+	mox queue suppress list [-account account]
+	mox queue suppress add account address
+	mox queue suppress remove account address
+	mox queue suppress lookup [-account account] address
+	mox queue webhook list [filtersortflags]
+	mox queue webhook schedule [filterflags] duration
+	mox queue webhook cancel [filterflags]
+	mox queue webhook print id
+	mox queue webhook retired list [filtersortflags]
+	mox queue webhook retired print id
 	mox import maildir accountname mailboxname maildir
 	mox import mbox accountname mailboxname mbox
 	mox export maildir dst-dir account-path [mailbox]
@@ -59,7 +71,7 @@ any parameters. Followed by the help and usage information for each command.
 	mox config describe-sendmail >/etc/moxsubmit.conf
 	mox config printservice >mox.service
 	mox config ensureacmehostprivatekeys
-	mox example [name]
+	mox config example [name]
 	mox checkupdate
 	mox cid cid
 	mox clientconfig domain
@@ -88,6 +100,8 @@ any parameters. Followed by the help and usage information for each command.
 	mox tlsrpt lookup domain
 	mox tlsrpt parsereportmsg message ...
 	mox version
+	mox webapi [method [baseurl-with-credentials]
+	mox example [name]
 	mox bumpuidvalidity account [mailbox]
 	mox reassignuids account [mailboxid]
 	mox fixuidmeta account
@@ -143,8 +157,8 @@ domains with HTTP/HTTPS, including with automatic TLS with ACME, is easily
 configured through both configuration files and admin web interface, and can act
 as a reverse proxy (and static file server for that matter), so you can forward
 traffic to your existing backend applications. Look for "WebHandlers:" in the
-output of "mox config describe-domains" and see the output of "mox example
-webhandlers".
+output of "mox config describe-domains" and see the output of
+"mox config example webhandlers".
 
 usage: mox quickstart [-skipdial] [-existing-webserver] [-hostname host] user@domain [user | uid]
   -existing-webserver
|
||||||
|
|
||||||
Prints the message with its ID, last and next delivery attempts, last error.
|
Prints the message with its ID, last and next delivery attempts, last error.
|
||||||
|
|
||||||
usage: mox queue list [filterflags]
|
usage: mox queue list [filtersortflags]
|
||||||
-account string
|
-account string
|
||||||
account that queued the message
|
account that queued the message
|
||||||
|
-asc
|
||||||
|
sort ascending instead of descending (default)
|
||||||
-from string
|
-from string
|
||||||
from address of message, use "@example.com" to match all messages for a domain
|
from address of message, use "@example.com" to match all messages for a domain
|
||||||
-hold value
|
-hold value
|
||||||
true or false, whether to match only messages that are (not) on hold
|
true or false, whether to match only messages that are (not) on hold
|
||||||
-ids value
|
-ids value
|
||||||
comma-separated list of message IDs
|
comma-separated list of message IDs
|
||||||
|
-n int
|
||||||
|
number of messages to return
|
||||||
-nextattempt string
|
-nextattempt string
|
||||||
filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
|
filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
|
||||||
|
-sort value
|
||||||
|
field to sort by, "nextattempt" (default) or "queued"
|
||||||
-submitted string
|
-submitted string
|
||||||
filter by time of submission relative to now, value must start with "<" (before now) or ">" (after now)
|
filter by time of submission relative to now, value must start with "<" (before now) or ">" (after now)
|
||||||
-to string
|
-to string
|
||||||
|
@@ -278,6 +298,8 @@ otherwise handled by the admin.
 		true or false, whether to match only messages that are (not) on hold
 	-ids value
 		comma-separated list of message IDs
+	-n int
+		number of messages to return
 	-nextattempt string
 		filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
 	-submitted string
@@ -303,6 +325,8 @@ delivery attempt. See the "queue schedule" command.
 		true or false, whether to match only messages that are (not) on hold
 	-ids value
 		comma-separated list of message IDs
+	-n int
+		number of messages to return
 	-nextattempt string
 		filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
 	-submitted string
@@ -322,7 +346,7 @@ current time, instead of added to the current scheduled time.
 
 Schedule immediate delivery with "mox queue schedule -now 0".
 
-	usage: mox queue schedule [filterflags] duration
+	usage: mox queue schedule [filterflags] [-now] duration
 	-account string
 		account that queued the message
 	-from string
@@ -331,6 +355,8 @@ Schedule immediate delivery with "mox queue schedule -now 0".
 		true or false, whether to match only messages that are (not) on hold
 	-ids value
 		comma-separated list of message IDs
+	-n int
+		number of messages to return
 	-nextattempt string
 		filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
 	-now
@@ -360,6 +386,8 @@ another mail server or with connections over a SOCKS proxy.
 		true or false, whether to match only messages that are (not) on hold
 	-ids value
 		comma-separated list of message IDs
+	-n int
+		number of messages to return
 	-nextattempt string
 		filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
 	-submitted string
@@ -392,6 +420,8 @@ TLS.
 		true or false, whether to match only messages that are (not) on hold
 	-ids value
 		comma-separated list of message IDs
+	-n int
+		number of messages to return
 	-nextattempt string
 		filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
 	-submitted string
@@ -418,6 +448,8 @@ contains a line saying the message was canceled by the admin.
 		true or false, whether to match only messages that are (not) on hold
 	-ids value
 		comma-separated list of message IDs
+	-n int
+		number of messages to return
 	-nextattempt string
 		filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
 	-submitted string
@@ -443,6 +475,8 @@ the message, use "queue dump" before removing.
 		true or false, whether to match only messages that are (not) on hold
 	-ids value
 		comma-separated list of message IDs
+	-n int
+		number of messages to return
 	-nextattempt string
 		filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
 	-submitted string
@@ -460,6 +494,180 @@ The message is printed to stdout and is in standard internet mail format.
 
 	usage: mox queue dump id
 
+# mox queue retired list
+
+List matching messages in the retired queue.
+
+Prints messages with their ID and results.
+
+	usage: mox queue retired list [filtersortflags]
+	-account string
+		account that queued the message
+	-asc
+		sort ascending instead of descending (default)
+	-from string
+		from address of message, use "@example.com" to match all messages for a domain
+	-ids value
+		comma-separated list of retired message IDs
+	-lastactivity string
+		filter by time of last activity relative to now, value must start with "<" (before now) or ">" (after now)
+	-n int
+		number of messages to return
+	-result value
+		"success" or "failure" as result of delivery
+	-sort value
+		field to sort by, "lastactivity" (default) or "queued"
+	-submitted string
+		filter by time of submission relative to now, value must start with "<" (before now) or ">" (after now)
+	-to string
+		recipient address of message, use "@example.com" to match all messages for a domain
+	-transport value
+		transport to use for messages, empty string sets the default behaviour
+
+# mox queue retired print
+
+Print a message from the retired queue.
+
+Prints a JSON representation of the information from the retired queue.
+
+	usage: mox queue retired print id
+
+# mox queue suppress list
+
+Print addresses in suppression list.
+
+	usage: mox queue suppress list [-account account]
+	-account string
+		only show suppression list for this account
+
+# mox queue suppress add
+
+Add address to suppression list for account.
+
+	usage: mox queue suppress add account address
+
+# mox queue suppress remove
+
+Remove address from suppression list for account.
+
+	usage: mox queue suppress remove account address
+
+# mox queue suppress lookup
+
+Check if address is present in suppression list, for any or specific account.
+
+	usage: mox queue suppress lookup [-account account] address
+	-account string
+		only check address in specified account
+
+# mox queue webhook list
+
+List matching webhooks in the queue.
+
+Prints list of webhooks, their IDs and basic information.
+
+	usage: mox queue webhook list [filtersortflags]
+	-account string
+		account that queued the message/webhook
+	-asc
+		sort ascending instead of descending (default)
+	-event value
+		event this webhook is about: incoming, delivered, suppressed, delayed, failed, relayed, expanded, canceled, unrecognized
+	-ids value
+		comma-separated list of webhook IDs
+	-n int
+		number of webhooks to return
+	-nextattempt string
+		filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
+	-sort value
+		field to sort by, "nextattempt" (default) or "queued"
+	-submitted string
+		filter by time of submission relative to now, value must start with "<" (before now) or ">" (after now)
+
+# mox queue webhook schedule
+
+Change next delivery attempt for matching webhooks.
+
+The next delivery attempt is adjusted by the duration parameter. If the -now
+flag is set, the new delivery attempt is set to the duration added to the
+current time, instead of added to the current scheduled time.
+
+Schedule immediate delivery with "mox queue schedule -now 0".
+
+	usage: mox queue webhook schedule [filterflags] duration
+	-account string
+		account that queued the message/webhook
+	-event value
+		event this webhook is about: incoming, delivered, suppressed, delayed, failed, relayed, expanded, canceled, unrecognized
+	-ids value
+		comma-separated list of webhook IDs
+	-n int
+		number of webhooks to return
+	-nextattempt string
+		filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
+	-now
+		schedule for duration relative to current time instead of relative to current next delivery attempt for webhooks
+	-submitted string
+		filter by time of submission relative to now, value must start with "<" (before now) or ">" (after now)
+
+# mox queue webhook cancel
+
+Fail delivery of matching webhooks.
+
+	usage: mox queue webhook cancel [filterflags]
+	-account string
+		account that queued the message/webhook
+	-event value
+		event this webhook is about: incoming, delivered, suppressed, delayed, failed, relayed, expanded, canceled, unrecognized
+	-ids value
+		comma-separated list of webhook IDs
+	-n int
+		number of webhooks to return
+	-nextattempt string
+		filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)
+	-submitted string
+		filter by time of submission relative to now, value must start with "<" (before now) or ">" (after now)
+
+# mox queue webhook print
+
+Print details of a webhook from the queue.
+
+The webhook is printed to stdout as JSON.
+
+	usage: mox queue webhook print id
+
+# mox queue webhook retired list
+
+List matching webhooks in the retired queue.
+
+Prints list of retired webhooks, their IDs and basic information.
+
+	usage: mox queue webhook retired list [filtersortflags]
+	-account string
+		account that queued the message/webhook
+	-asc
+		sort ascending instead of descending (default)
+	-event value
+		event this webhook is about: incoming, delivered, suppressed, delayed, failed, relayed, expanded, canceled, unrecognized
+	-ids value
+		comma-separated list of retired webhook IDs
+	-lastactivity string
+		filter by time of last activity relative to now, value must start with "<" (before now) or ">" (after now)
+	-n int
+		number of webhooks to return
+	-sort value
+		field to sort by, "lastactivity" (default) or "queued"
+	-submitted string
+		filter by time of submission relative to now, value must start with "<" (before now) or ">" (after now)
+
+# mox queue webhook retired print
+
+Print details of a webhook from the retired queue.
+
+The retired webhook is printed to stdout as JSON.
+
+	usage: mox queue webhook retired print id
+
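The webhooks in this queue are delivered as HTTP POST requests with a JSON body describing the event. A receiving application only needs an HTTP and JSON library; the sketch below is not part of mox, its URL path is arbitrary, and the struct mirrors just a handful of fields of the outgoing webhook (the JSON casing may differ from these Go names, which encoding/json matches case-insensitively on decode):

	package main

	import (
		"encoding/json"
		"log"
		"net/http"
	)

	// outgoingEvent mirrors a subset of the outgoing-delivery webhook body.
	type outgoingEvent struct {
		Version    int
		Event      string // "delivered", "failed", "suppressed", "delayed", ...
		QueueMsgID int64
		MessageID  string
		Subject    string
		Error      string
		SMTPCode   int
		Extra      map[string]string
	}

	func main() {
		// The path is whatever URL was configured for the account's outgoing webhooks.
		http.HandleFunc("/hooks/outgoing", func(w http.ResponseWriter, r *http.Request) {
			var ev outgoingEvent
			if err := json.NewDecoder(r.Body).Decode(&ev); err != nil {
				http.Error(w, "bad json", http.StatusBadRequest)
				return
			}
			log.Printf("queue message %d: event %q, smtp code %d, error %q, extra %v",
				ev.QueueMsgID, ev.Event, ev.SMTPCode, ev.Error, ev.Extra)
			// A successful response marks the webhook as delivered; failures are
			// retried by the webhook queue, similar to message deliveries.
			w.WriteHeader(http.StatusOK)
		})
		log.Fatal(http.ListenAndServe("localhost:8000", nil))
	}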
 # mox import maildir
 
 Import a maildir into an account.
@@ -552,8 +760,9 @@ automatically initialized with configuration files, an account with email
 address mox@localhost and password moxmoxmox, and a newly generated self-signed
 TLS certificate.
 
-All incoming email to any address is accepted (if checks pass), unless the
-recipient localpart ends with:
+All incoming email to any address is accepted (if checks pass) and delivered to
+the account that is submitting the message, unless the recipient localpart ends
+with:
 
 - "temperror": fail with a temporary error code
 - "permerror": fail with a permanent error code
@@ -561,7 +770,8 @@ recipient localpart ends with:
 - "timeout": no response (for an hour)
 
 If the localpart begins with "mailfrom" or "rcptto", the error is returned
-during those commands instead of during "data".
+during those commands instead of during "data". If the localpart beings with
+"queue", the submission is accepted but delivery from the queue will fail.
 
 	usage: mox localserve
 	-dir string
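These failure modes can be exercised from code during local development by submitting over the SMTPS submission port that localserve prints (1465, account mox@localhost, password moxmoxmox) and addressing the message to a localpart like "queue". A rough sketch using only the standard library, not part of mox:

	package main

	import (
		"crypto/tls"
		"log"
		"net/smtp"
		"strings"
	)

	func check(err error) {
		if err != nil {
			log.Fatal(err)
		}
	}

	func main() {
		// localserve generates a self-signed certificate, so verification is skipped here.
		conn, err := tls.Dial("tcp", "localhost:1465", &tls.Config{InsecureSkipVerify: true})
		check(err)
		c, err := smtp.NewClient(conn, "localhost")
		check(err)
		defer c.Quit()

		check(c.Auth(smtp.PlainAuth("", "mox@localhost", "moxmoxmox", "localhost")))
		check(c.Mail("mox@localhost"))
		// A recipient localpart starting with "queue" is accepted at submission,
		// but delivery from the queue will then fail.
		check(c.Rcpt("queue@localhost"))

		w, err := c.Data()
		check(err)
		msg := strings.Join([]string{
			"From: <mox@localhost>",
			"To: <queue@localhost>",
			"Subject: simulate queue failure",
			"",
			"test",
			"",
		}, "\r\n")
		_, err = w.Write([]byte(msg))
		check(err)
		check(w.Close())
	}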
@@ -793,11 +1003,11 @@ for a domain and create the TLSA DNS records it suggests to enable DANE.
 
 	usage: mox config ensureacmehostprivatekeys
 
-# mox example
+# mox config example
 
-List available examples, or print a specific example.
+List available config examples, or print a specific example.
 
-	usage: mox example [name]
+	usage: mox config example [name]
 
 # mox checkupdate
 
@@ -1128,6 +1338,18 @@ Prints this mox version.
 
 	usage: mox version
 
+# mox webapi
+
+Lists available methods, prints request/response parameters for method, or calls a method with a request read from standard input.
+
+	usage: mox webapi [method [baseurl-with-credentials]
+
+# mox example
+
+List available examples, or print a specific example.
+
+	usage: mox example [name]
+
 
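The webapi methods listed by this command are plain HTTP POST endpoints, which is also how "mox webapi" itself calls them: the request is a form with a single "request" field holding the JSON request, posted to the base URL plus the method name, with credentials in the URL. A rough client sketch, not part of mox; the base URL below is the localserve endpoint and the method name is a placeholder for one of the names "mox webapi" lists:

	package main

	import (
		"fmt"
		"io"
		"log"
		"net/http"
		"net/url"
	)

	func main() {
		// Base URL with credentials, as also passed to "mox webapi"; adjust
		// host/port/path for a real configuration.
		baseURL := "http://mox%40localhost:moxmoxmox@localhost:1080/webapi/"
		method := "SomeMethod" // hypothetical; use a name from "mox webapi"

		// The request shape depends on the method; "mox webapi <method>" prints an example.
		request := `{}`

		resp, err := http.PostForm(baseURL+method, url.Values{"request": {request}})
		if err != nil {
			log.Fatal(err)
		}
		defer resp.Body.Close()
		body, err := io.ReadAll(resp.Body)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println(resp.Status)
		fmt.Println(string(body))
	}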
 # mox bumpuidvalidity
 
 Change the IMAP UID validity of the mailbox, causing IMAP clients to refetch messages.
 
@@ -1212,6 +1434,8 @@ and print them.
 Parse message, print JSON representation.
 
 	usage: mox message parse message.eml
+	-smtputf8
+		check if message needs smtputf8
 
 # mox reassignthreads
 
dsn/dsn.go | 10

@@ -114,9 +114,9 @@ type Recipient struct {
 	// deliveries.
 	RemoteMTA NameIP
 
-	// DiagnosticCode should either be empty, or start with "smtp; " followed by the
-	// literal full SMTP response lines, space separated.
-	DiagnosticCode string
+	// DiagnosticCodeSMTP are the full SMTP response lines, space separated. The marshaled
+	// form starts with "smtp; ", this value does not.
+	DiagnosticCodeSMTP string
 
 	LastAttemptDate time.Time
 	FinalLogID string
@@ -286,9 +286,9 @@ func (m *Message) Compose(log mlog.Log, smtputf8 bool) ([]byte, error) {
 		status("Remote-MTA", s)
 	}
 	// Presence of Diagnostic-Code indicates the code is from Remote-MTA. ../rfc/3464:1053
-	if r.DiagnosticCode != "" {
+	if r.DiagnosticCodeSMTP != "" {
 		// ../rfc/3461:1342 ../rfc/6533:589
-		status("Diagnostic-Code", r.DiagnosticCode)
+		status("Diagnostic-Code", "smtp; "+r.DiagnosticCodeSMTP)
 	}
 	if !r.LastAttemptDate.IsZero() {
 		status("Last-Attempt-Date", r.LastAttemptDate.Format(message.RFC5322Z)) // ../rfc/3464:1076
@@ -249,7 +249,7 @@ func parseRecipientHeader(mr *textproto.Reader, utf8 bool) (Recipient, error) {
 	} else if len(t) != 2 {
 		err = fmt.Errorf("missing semicolon to separate diagnostic-type from code")
 	} else {
-		r.DiagnosticCode = strings.TrimSpace(t[1])
+		r.DiagnosticCodeSMTP = strings.TrimSpace(t[1])
 	}
 	case "Last-Attempt-Date":
 		r.LastAttemptDate, err = parseDateTime(v)
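The rename makes the split explicit: the field holds only the raw SMTP response, and the "smtp; " diagnostic-type from RFC 3464 is added when composing and stripped when parsing. A small standalone illustration of that round trip (not mox code, the response text is made up):

	package main

	import (
		"fmt"
		"strings"
	)

	func main() {
		// Composing: the field value carries no diagnostic-type prefix.
		diagnosticCodeSMTP := "550 5.1.1 no such user"
		header := "Diagnostic-Code: smtp; " + diagnosticCodeSMTP
		fmt.Println(header)

		// Parsing: split on the first ";" and trim, leaving only the SMTP lines.
		value := strings.TrimPrefix(header, "Diagnostic-Code: ")
		t := strings.SplitN(value, ";", 2)
		fmt.Println(strings.TrimSpace(t[1])) // "550 5.1.1 no such user"
	}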
130
examples.go
130
examples.go
|
@ -1,13 +1,21 @@
|
||||||
package main
|
package main
|
||||||
|
|
||||||
import (
|
import (
|
||||||
|
"bytes"
|
||||||
|
"encoding/base64"
|
||||||
|
"encoding/json"
|
||||||
"fmt"
|
"fmt"
|
||||||
"log"
|
"log"
|
||||||
|
"reflect"
|
||||||
"strings"
|
"strings"
|
||||||
|
"time"
|
||||||
|
|
||||||
"github.com/mjl-/sconf"
|
"github.com/mjl-/sconf"
|
||||||
|
|
||||||
"github.com/mjl-/mox/config"
|
"github.com/mjl-/mox/config"
|
||||||
|
"github.com/mjl-/mox/mox-"
|
||||||
|
"github.com/mjl-/mox/smtp"
|
||||||
|
"github.com/mjl-/mox/webhook"
|
||||||
)
|
)
|
||||||
|
|
||||||
func cmdExample(c *cmd) {
|
func cmdExample(c *cmd) {
|
||||||
|
@ -36,7 +44,33 @@ func cmdExample(c *cmd) {
|
||||||
fmt.Print(match())
|
fmt.Print(match())
|
||||||
}
|
}
|
||||||
|
|
||||||
var examples = []struct {
|
func cmdConfigExample(c *cmd) {
|
||||||
|
c.params = "[name]"
|
||||||
|
c.help = `List available config examples, or print a specific example.`
|
||||||
|
|
||||||
|
args := c.Parse()
|
||||||
|
if len(args) > 1 {
|
||||||
|
c.Usage()
|
||||||
|
}
|
||||||
|
|
||||||
|
var match func() string
|
||||||
|
for _, ex := range configExamples {
|
||||||
|
if len(args) == 0 {
|
||||||
|
fmt.Println(ex.Name)
|
||||||
|
} else if args[0] == ex.Name {
|
||||||
|
match = ex.Get
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if len(args) == 0 {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
if match == nil {
|
||||||
|
log.Fatalln("not found")
|
||||||
|
}
|
||||||
|
fmt.Print(match())
|
||||||
|
}
|
||||||
|
|
||||||
|
var configExamples = []struct {
|
||||||
Name string
|
Name string
|
||||||
Get func() string
|
Get func() string
|
||||||
}{
|
}{
|
||||||
|
@ -195,3 +229,97 @@ Routes:
|
||||||
},
|
},
|
||||||
},
|
},
|
||||||
}
|
}
|
||||||
|
|
||||||
|
var exampleTime = time.Date(2024, time.March, 27, 0, 0, 0, 0, time.UTC)
|
||||||
|
|
||||||
|
var examples = []struct {
|
||||||
|
Name string
|
||||||
|
Get func() string
|
||||||
|
}{
|
||||||
|
{
|
||||||
|
"webhook-outgoing-delivered",
|
||||||
|
func() string {
|
||||||
|
v := webhook.Outgoing{
|
||||||
|
Version: 0,
|
||||||
|
Event: webhook.EventDelivered,
|
||||||
|
QueueMsgID: 101,
|
||||||
|
FromID: base64.RawURLEncoding.EncodeToString([]byte("0123456789abcdef")),
|
||||||
|
MessageID: "<QnxzgulZK51utga6agH_rg@mox.example>",
|
||||||
|
Subject: "subject of original message",
|
||||||
|
WebhookQueued: exampleTime,
|
||||||
|
Extra: map[string]string{},
|
||||||
|
SMTPCode: smtp.C250Completed,
|
||||||
|
}
|
||||||
|
return "Example webhook HTTP POST JSON body for successful outgoing delivery:\n\n\t" + formatJSON(v)
|
||||||
|
},
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"webhook-outgoing-dsn-failed",
|
||||||
|
func() string {
|
||||||
|
v := webhook.Outgoing{
|
||||||
|
Version: 0,
|
||||||
|
Event: webhook.EventFailed,
|
||||||
|
DSN: true,
|
||||||
|
Suppressing: true,
|
||||||
|
QueueMsgID: 102,
|
||||||
|
FromID: base64.RawURLEncoding.EncodeToString([]byte("0123456789abcdef")),
|
||||||
|
MessageID: "<QnxzgulZK51utga6agH_rg@mox.example>",
|
||||||
|
Subject: "subject of original message",
|
||||||
|
WebhookQueued: exampleTime,
|
||||||
|
Extra: map[string]string{"userid": "456"},
|
||||||
|
Error: "timeout connecting to host",
|
||||||
|
SMTPCode: smtp.C554TransactionFailed,
|
||||||
|
SMTPEnhancedCode: "5." + smtp.SeNet4Other0,
|
||||||
|
}
|
||||||
|
return `Example webhook HTTP POST JSON body for failed delivery based on incoming DSN
|
||||||
|
message, with custom extra data fields (from original submission), and adding address to the suppression list:
|
||||||
|
|
||||||
|
` + formatJSON(v)
|
||||||
|
},
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"webhook-incoming-basic",
|
||||||
|
func() string {
|
||||||
|
v := webhook.Incoming{
|
||||||
|
Version: 0,
|
||||||
|
From: []webhook.NameAddress{{Address: "mox@localhost"}},
|
||||||
|
To: []webhook.NameAddress{{Address: "mjl@localhost"}},
|
||||||
|
Subject: "hi",
|
||||||
|
MessageID: "<QnxzgulZK51utga6agH_rg@mox.example>",
|
||||||
|
Date: &exampleTime,
|
||||||
|
Text: "hello world ☺\n",
|
||||||
|
Structure: webhook.Structure{
|
||||||
|
ContentType: "text/plain",
|
||||||
|
ContentTypeParams: map[string]string{"charset": "utf-8"},
|
||||||
|
DecodedSize: int64(len("hello world ☺\r\n")),
|
||||||
|
Parts: []webhook.Structure{},
|
||||||
|
},
|
||||||
|
Meta: webhook.IncomingMeta{
|
||||||
|
MsgID: 201,
|
||||||
|
MailFrom: "mox@localhost",
|
||||||
|
MailFromValidated: false,
|
||||||
|
MsgFromValidated: true,
|
||||||
|
RcptTo: "mjl@localhost",
|
||||||
|
DKIMVerifiedDomains: []string{"localhost"},
|
||||||
|
RemoteIP: "127.0.0.1",
|
||||||
|
Received: exampleTime.Add(3 * time.Second),
|
||||||
|
MailboxName: "Inbox",
|
||||||
|
Automated: false,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
return "Example JSON body for webhooks for incoming delivery of basic message:\n\n\t" + formatJSON(v)
|
||||||
|
},
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
func formatJSON(v any) string {
|
||||||
|
nv, _ := mox.FillNil(reflect.ValueOf(v))
|
||||||
|
v = nv.Interface()
|
||||||
|
var b bytes.Buffer
|
||||||
|
enc := json.NewEncoder(&b)
|
||||||
|
enc.SetIndent("\t", "\t")
|
||||||
|
enc.SetEscapeHTML(false)
|
||||||
|
err := enc.Encode(v)
|
||||||
|
xcheckf(err, "encoding to json")
|
||||||
|
return b.String()
|
||||||
|
}
|
||||||
|
gendoc.sh | 14

@@ -1,5 +1,6 @@
 #!/usr/bin/env sh
 
+# ./doc.go
 (
 cat <<EOF
 /*
@@ -38,6 +39,7 @@ EOF
 )>doc.go
 gofmt -w doc.go
 
+# ./config/doc.go
 (
 cat <<EOF
 /*
@@ -92,15 +94,15 @@ cat <<EOF
 # Examples
 
 Mox includes configuration files to illustrate common setups. You can see these
-examples with "mox example", and print a specific example with "mox example
-<name>". Below are all examples included in mox.
+examples with "mox config example", and print a specific example with "mox
+config example <name>". Below are all examples included in mox.
 
 EOF
 
-for ex in $(./mox example); do
+for ex in $(./mox config example); do
 	echo '# Example '$ex
 	echo
-	./mox example $ex | sed 's/^/\t/'
+	./mox config example $ex | sed 's/^/\t/'
 	echo
 done
 
@@ -112,3 +114,7 @@ package config
 EOF
 )>config/doc.go
 gofmt -w config/doc.go
 
+# ./webapi/doc.go
+./webapi/gendoc.sh >webapi/doc.go
+gofmt -w webapi/doc.go
|
@ -233,7 +233,7 @@ Accounts:
|
||||||
const qmsg = "From: <test0@mox.example>\r\nTo: <other@remote.example>\r\nSubject: test\r\n\r\nthe message...\r\n"
|
const qmsg = "From: <test0@mox.example>\r\nTo: <other@remote.example>\r\nSubject: test\r\n\r\nthe message...\r\n"
|
||||||
_, err = fmt.Fprint(mf, qmsg)
|
_, err = fmt.Fprint(mf, qmsg)
|
||||||
xcheckf(err, "writing message")
|
xcheckf(err, "writing message")
|
||||||
qm := queue.MakeMsg(mailfrom, rcptto, false, false, int64(len(qmsg)), "<test@localhost>", prefix, nil, time.Now())
|
qm := queue.MakeMsg(mailfrom, rcptto, false, false, int64(len(qmsg)), "<test@localhost>", prefix, nil, time.Now(), "test")
|
||||||
err = queue.Add(ctxbg, c.log, "test0", mf, qm)
|
err = queue.Add(ctxbg, c.log, "test0", mf, qm)
|
||||||
xcheckf(err, "enqueue message")
|
xcheckf(err, "enqueue message")
|
||||||
|
|
||||||
|
|
25
http/web.go
25
http/web.go
|
@ -35,6 +35,7 @@ import (
|
||||||
"github.com/mjl-/mox/ratelimit"
|
"github.com/mjl-/mox/ratelimit"
|
||||||
"github.com/mjl-/mox/webaccount"
|
"github.com/mjl-/mox/webaccount"
|
||||||
"github.com/mjl-/mox/webadmin"
|
"github.com/mjl-/mox/webadmin"
|
||||||
|
"github.com/mjl-/mox/webapisrv"
|
||||||
"github.com/mjl-/mox/webmail"
|
"github.com/mjl-/mox/webmail"
|
||||||
)
|
)
|
||||||
|
|
||||||
|
@ -577,6 +578,30 @@ func Listen() {
|
||||||
if maxMsgSize == 0 {
|
if maxMsgSize == 0 {
|
||||||
maxMsgSize = config.DefaultMaxMsgSize
|
maxMsgSize = config.DefaultMaxMsgSize
|
||||||
}
|
}
|
||||||
|
|
||||||
|
if l.WebAPIHTTP.Enabled {
|
||||||
|
port := config.Port(l.WebAPIHTTP.Port, 80)
|
||||||
|
path := "/webapi/"
|
||||||
|
if l.WebAPIHTTP.Path != "" {
|
||||||
|
path = l.WebAPIHTTP.Path
|
||||||
|
}
|
||||||
|
srv := ensureServe(false, port, "webapi-http at "+path)
|
||||||
|
handler := safeHeaders(http.StripPrefix(path[:len(path)-1], webapisrv.NewServer(maxMsgSize, path, l.WebAPIHTTP.Forwarded)))
|
||||||
|
srv.Handle("webapi", nil, path, handler)
|
||||||
|
redirectToTrailingSlash(srv, "webapi", path)
|
||||||
|
}
|
||||||
|
if l.WebAPIHTTPS.Enabled {
|
||||||
|
port := config.Port(l.WebAPIHTTPS.Port, 443)
|
||||||
|
path := "/webapi/"
|
||||||
|
if l.WebAPIHTTPS.Path != "" {
|
||||||
|
path = l.WebAPIHTTPS.Path
|
||||||
|
}
|
||||||
|
srv := ensureServe(true, port, "webapi-https at "+path)
|
||||||
|
handler := safeHeaders(http.StripPrefix(path[:len(path)-1], webapisrv.NewServer(maxMsgSize, path, l.WebAPIHTTPS.Forwarded)))
|
||||||
|
srv.Handle("webapi", nil, path, handler)
|
||||||
|
redirectToTrailingSlash(srv, "webapi", path)
|
||||||
|
}
|
||||||
|
|
||||||
if l.WebmailHTTP.Enabled {
|
if l.WebmailHTTP.Enabled {
|
||||||
port := config.Port(l.WebmailHTTP.Port, 80)
|
port := config.Port(l.WebmailHTTP.Port, 80)
|
||||||
path := "/webmail/"
|
path := "/webmail/"
|
||||||
|
|
lib.ts | 1

@@ -216,6 +216,7 @@ const attr = {
 	autocomplete: (s: string) => _attr('autocomplete', s),
 	list: (s: string) => _attr('list', s),
 	form: (s: string) => _attr('form', s),
+	size: (s: string) => _attr('size', s),
 }
 const style = (x: {[k: string]: string | number}) => { return {_styles: x}}
 const prop = (x: {[k: string]: any}) => { return {_props: x}}
|
|
@ -49,8 +49,9 @@ automatically initialized with configuration files, an account with email
|
||||||
address mox@localhost and password moxmoxmox, and a newly generated self-signed
|
address mox@localhost and password moxmoxmox, and a newly generated self-signed
|
||||||
TLS certificate.
|
TLS certificate.
|
||||||
|
|
||||||
All incoming email to any address is accepted (if checks pass), unless the
|
All incoming email to any address is accepted (if checks pass) and delivered to
|
||||||
recipient localpart ends with:
|
the account that is submitting the message, unless the recipient localpart ends
|
||||||
|
with:
|
||||||
|
|
||||||
- "temperror": fail with a temporary error code
|
- "temperror": fail with a temporary error code
|
||||||
- "permerror": fail with a permanent error code
|
- "permerror": fail with a permanent error code
|
||||||
|
@ -58,7 +59,8 @@ recipient localpart ends with:
|
||||||
- "timeout": no response (for an hour)
|
- "timeout": no response (for an hour)
|
||||||
|
|
||||||
If the localpart begins with "mailfrom" or "rcptto", the error is returned
|
If the localpart begins with "mailfrom" or "rcptto", the error is returned
|
||||||
during those commands instead of during "data".
|
during those commands instead of during "data". If the localpart beings with
|
||||||
|
"queue", the submission is accepted but delivery from the queue will fail.
|
||||||
`
|
`
|
||||||
golog.SetFlags(0)
|
golog.SetFlags(0)
|
||||||
|
|
||||||
|
@ -163,7 +165,9 @@ during those commands instead of during "data".
|
||||||
golog.Printf(`- [45][0-9][0-9]: fail with the specific error code.`)
|
golog.Printf(`- [45][0-9][0-9]: fail with the specific error code.`)
|
||||||
golog.Printf(`- "timeout": no response (for an hour).`)
|
golog.Printf(`- "timeout": no response (for an hour).`)
|
||||||
golog.Print("")
|
golog.Print("")
|
||||||
golog.Printf(`if the localpart begins with "mailfrom" or "rcptto", the error is returned during those commands instead of during "data"`)
|
golog.Print(`if the localpart begins with "mailfrom" or "rcptto", the error is returned`)
|
||||||
|
golog.Print(`during those commands instead of during "data". if the localpart beings with`)
|
||||||
|
golog.Print(`"queue", the submission is accepted but delivery from the queue will fail.`)
|
||||||
golog.Print("")
|
golog.Print("")
|
||||||
golog.Print(" smtp://localhost:1025 - receive email")
|
golog.Print(" smtp://localhost:1025 - receive email")
|
||||||
golog.Print("smtps://mox%40localhost:moxmoxmox@localhost:1465 - send email")
|
golog.Print("smtps://mox%40localhost:moxmoxmox@localhost:1465 - send email")
|
||||||
|
@ -174,6 +178,8 @@ during those commands instead of during "data".
|
||||||
golog.Print(" http://localhost:1080/account/ - account http (without tls)")
|
golog.Print(" http://localhost:1080/account/ - account http (without tls)")
|
||||||
golog.Print("https://localhost:1443/webmail/ - webmail https (email mox@localhost, password moxmoxmox)")
|
golog.Print("https://localhost:1443/webmail/ - webmail https (email mox@localhost, password moxmoxmox)")
|
||||||
golog.Print(" http://localhost:1080/webmail/ - webmail http (without tls)")
|
golog.Print(" http://localhost:1080/webmail/ - webmail http (without tls)")
|
||||||
|
golog.Print("https://localhost:1443/webapi/ - webmail https (email mox@localhost, password moxmoxmox)")
|
||||||
|
golog.Print(" http://localhost:1080/webapi/ - webmail http (without tls)")
|
||||||
golog.Print("https://localhost:1443/admin/ - admin https (password moxadmin)")
|
golog.Print("https://localhost:1443/admin/ - admin https (password moxadmin)")
|
||||||
golog.Print(" http://localhost:1080/admin/ - admin http (without tls)")
|
golog.Print(" http://localhost:1080/admin/ - admin http (without tls)")
|
||||||
golog.Print("")
|
golog.Print("")
|
||||||
|
@ -332,6 +338,12 @@ func writeLocalConfig(log mlog.Log, dir, ip string) (rerr error) {
|
||||||
local.WebmailHTTPS.Enabled = true
|
local.WebmailHTTPS.Enabled = true
|
||||||
local.WebmailHTTPS.Port = 1443
|
local.WebmailHTTPS.Port = 1443
|
||||||
local.WebmailHTTPS.Path = "/webmail/"
|
local.WebmailHTTPS.Path = "/webmail/"
|
||||||
|
local.WebAPIHTTP.Enabled = true
|
||||||
|
local.WebAPIHTTP.Port = 1080
|
||||||
|
local.WebAPIHTTP.Path = "/webapi/"
|
||||||
|
local.WebAPIHTTPS.Enabled = true
|
||||||
|
local.WebAPIHTTPS.Port = 1443
|
||||||
|
local.WebAPIHTTPS.Path = "/webapi/"
|
||||||
local.AdminHTTP.Enabled = true
|
local.AdminHTTP.Enabled = true
|
||||||
local.AdminHTTP.Port = 1080
|
local.AdminHTTP.Port = 1080
|
||||||
local.AdminHTTPS.Enabled = true
|
local.AdminHTTPS.Enabled = true
|
||||||
|
@ -375,7 +387,9 @@ func writeLocalConfig(log mlog.Log, dir, ip string) (rerr error) {
|
||||||
|
|
||||||
// Write domains.conf.
|
// Write domains.conf.
|
||||||
acc := config.Account{
|
acc := config.Account{
|
||||||
RejectsMailbox: "Rejects",
|
KeepRetiredMessagePeriod: 72 * time.Hour,
|
||||||
|
KeepRetiredWebhookPeriod: 72 * time.Hour,
|
||||||
|
RejectsMailbox: "Rejects",
|
||||||
Destinations: map[string]config.Destination{
|
Destinations: map[string]config.Destination{
|
||||||
"mox@localhost": {},
|
"mox@localhost": {},
|
||||||
},
|
},
|
||||||
|
|
279
main.go
279
main.go
|
@ -1,6 +1,7 @@
|
||||||
package main
|
package main
|
||||||
|
|
||||||
import (
|
import (
|
||||||
|
"bufio"
|
||||||
"bytes"
|
"bytes"
|
||||||
"context"
|
"context"
|
||||||
"crypto"
|
"crypto"
|
||||||
|
@ -23,9 +24,11 @@ import (
|
||||||
"log"
|
"log"
|
||||||
"log/slog"
|
"log/slog"
|
||||||
"net"
|
"net"
|
||||||
|
"net/http"
|
||||||
"net/url"
|
"net/url"
|
||||||
"os"
|
"os"
|
||||||
"path/filepath"
|
"path/filepath"
|
||||||
|
"reflect"
|
||||||
"runtime"
|
"runtime"
|
||||||
"slices"
|
"slices"
|
||||||
"strconv"
|
"strconv"
|
||||||
|
@ -57,6 +60,7 @@ import (
|
||||||
"github.com/mjl-/mox/moxvar"
|
"github.com/mjl-/mox/moxvar"
|
||||||
"github.com/mjl-/mox/mtasts"
|
"github.com/mjl-/mox/mtasts"
|
||||||
"github.com/mjl-/mox/publicsuffix"
|
"github.com/mjl-/mox/publicsuffix"
|
||||||
|
"github.com/mjl-/mox/queue"
|
||||||
"github.com/mjl-/mox/smtp"
|
"github.com/mjl-/mox/smtp"
|
||||||
"github.com/mjl-/mox/smtpclient"
|
"github.com/mjl-/mox/smtpclient"
|
||||||
"github.com/mjl-/mox/spf"
|
"github.com/mjl-/mox/spf"
|
||||||
|
@ -65,6 +69,7 @@ import (
|
||||||
"github.com/mjl-/mox/tlsrptdb"
|
"github.com/mjl-/mox/tlsrptdb"
|
||||||
"github.com/mjl-/mox/updates"
|
"github.com/mjl-/mox/updates"
|
||||||
"github.com/mjl-/mox/webadmin"
|
"github.com/mjl-/mox/webadmin"
|
||||||
|
"github.com/mjl-/mox/webapi"
|
||||||
)
|
)
|
||||||
|
|
||||||
var (
|
var (
|
||||||
|
@ -111,6 +116,18 @@ var commands = []struct {
|
||||||
{"queue fail", cmdQueueFail},
|
{"queue fail", cmdQueueFail},
|
||||||
{"queue drop", cmdQueueDrop},
|
{"queue drop", cmdQueueDrop},
|
||||||
{"queue dump", cmdQueueDump},
|
{"queue dump", cmdQueueDump},
|
||||||
|
{"queue retired list", cmdQueueRetiredList},
|
||||||
|
{"queue retired print", cmdQueueRetiredPrint},
|
||||||
|
{"queue suppress list", cmdQueueSuppressList},
|
||||||
|
{"queue suppress add", cmdQueueSuppressAdd},
|
||||||
|
{"queue suppress remove", cmdQueueSuppressRemove},
|
||||||
|
{"queue suppress lookup", cmdQueueSuppressLookup},
|
||||||
|
{"queue webhook list", cmdQueueHookList},
|
||||||
|
{"queue webhook schedule", cmdQueueHookSchedule},
|
||||||
|
{"queue webhook cancel", cmdQueueHookCancel},
|
||||||
|
{"queue webhook print", cmdQueueHookPrint},
|
||||||
|
{"queue webhook retired list", cmdQueueHookRetiredList},
|
||||||
|
{"queue webhook retired print", cmdQueueHookRetiredPrint},
|
||||||
{"import maildir", cmdImportMaildir},
|
{"import maildir", cmdImportMaildir},
|
||||||
{"import mbox", cmdImportMbox},
|
{"import mbox", cmdImportMbox},
|
||||||
{"export maildir", cmdExportMaildir},
|
{"export maildir", cmdExportMaildir},
|
||||||
|
@ -134,7 +151,7 @@ var commands = []struct {
|
||||||
{"config describe-sendmail", cmdConfigDescribeSendmail},
|
{"config describe-sendmail", cmdConfigDescribeSendmail},
|
||||||
{"config printservice", cmdConfigPrintservice},
|
{"config printservice", cmdConfigPrintservice},
|
||||||
{"config ensureacmehostprivatekeys", cmdConfigEnsureACMEHostprivatekeys},
|
{"config ensureacmehostprivatekeys", cmdConfigEnsureACMEHostprivatekeys},
|
||||||
{"example", cmdExample},
|
{"config example", cmdConfigExample},
|
||||||
|
|
||||||
{"checkupdate", cmdCheckupdate},
|
{"checkupdate", cmdCheckupdate},
|
||||||
{"cid", cmdCid},
|
{"cid", cmdCid},
|
||||||
|
@ -166,7 +183,9 @@ var commands = []struct {
|
||||||
{"tlsrpt lookup", cmdTLSRPTLookup},
|
{"tlsrpt lookup", cmdTLSRPTLookup},
|
||||||
{"tlsrpt parsereportmsg", cmdTLSRPTParsereportmsg},
|
{"tlsrpt parsereportmsg", cmdTLSRPTParsereportmsg},
|
||||||
{"version", cmdVersion},
|
{"version", cmdVersion},
|
||||||
|
{"webapi", cmdWebapi},
|
||||||
|
|
||||||
|
{"example", cmdExample},
|
||||||
{"bumpuidvalidity", cmdBumpUIDValidity},
|
{"bumpuidvalidity", cmdBumpUIDValidity},
|
||||||
{"reassignuids", cmdReassignUIDs},
|
{"reassignuids", cmdReassignUIDs},
|
||||||
{"fixuidmeta", cmdFixUIDMeta},
|
{"fixuidmeta", cmdFixUIDMeta},
|
||||||
|
@ -196,6 +215,7 @@ var commands = []struct {
|
||||||
{"ximport mbox", cmdXImportMbox},
|
{"ximport mbox", cmdXImportMbox},
|
||||||
{"openaccounts", cmdOpenaccounts},
|
{"openaccounts", cmdOpenaccounts},
|
||||||
{"readmessages", cmdReadmessages},
|
{"readmessages", cmdReadmessages},
|
||||||
|
{"queuefillretired", cmdQueueFillRetired},
|
||||||
}
|
}
|
||||||
|
|
||||||
var cmds []cmd
|
var cmds []cmd
|
||||||
|
@ -2384,6 +2404,7 @@ The report is printed in formatted JSON.
|
||||||
// todo future: only print the highlights?
|
// todo future: only print the highlights?
|
||||||
enc := json.NewEncoder(os.Stdout)
|
enc := json.NewEncoder(os.Stdout)
|
||||||
enc.SetIndent("", "\t")
|
enc.SetIndent("", "\t")
|
||||||
|
enc.SetEscapeHTML(false)
|
||||||
err = enc.Encode(reportJSON)
|
err = enc.Encode(reportJSON)
|
||||||
xcheckf(err, "write report")
|
xcheckf(err, "write report")
|
||||||
}
|
}
|
||||||
|
@ -2661,6 +2682,97 @@ func cmdVersion(c *cmd) {
|
||||||
fmt.Printf("%s %s/%s\n", runtime.Version(), runtime.GOOS, runtime.GOARCH)
|
fmt.Printf("%s %s/%s\n", runtime.Version(), runtime.GOOS, runtime.GOARCH)
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func cmdWebapi(c *cmd) {
|
||||||
|
c.params = "[method [baseurl-with-credentials]"
|
||||||
|
c.help = "Lists available methods, prints request/response parameters for method, or calls a method with a request read from standard input."
|
||||||
|
args := c.Parse()
|
||||||
|
if len(args) > 2 {
|
||||||
|
c.Usage()
|
||||||
|
}
|
||||||
|
|
||||||
|
t := reflect.TypeOf((*webapi.Methods)(nil)).Elem()
|
||||||
|
methods := map[string]reflect.Type{}
|
||||||
|
var ml []string
|
||||||
|
for i := 0; i < t.NumMethod(); i++ {
|
||||||
|
mt := t.Method(i)
|
||||||
|
methods[mt.Name] = mt.Type
|
||||||
|
ml = append(ml, mt.Name)
|
||||||
|
}
|
||||||
|
|
||||||
|
if len(args) == 0 {
|
||||||
|
fmt.Println(strings.Join(ml, "\n"))
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
mt, ok := methods[args[0]]
|
||||||
|
if !ok {
|
||||||
|
log.Fatalf("unknown method %q", args[0])
|
||||||
|
}
|
||||||
|
resultNotJSON := mt.Out(0).Kind() == reflect.Interface
|
||||||
|
|
||||||
|
if len(args) == 1 {
|
||||||
|
fmt.Println("# Example request")
|
||||||
|
fmt.Println()
|
||||||
|
printJSON("\t", mox.FillExample(nil, reflect.New(mt.In(1))).Interface())
|
||||||
|
fmt.Println()
|
||||||
|
if resultNotJSON {
|
||||||
|
fmt.Println("Output is non-JSON data.")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
fmt.Println("# Example response")
|
||||||
|
fmt.Println()
|
||||||
|
printJSON("\t", mox.FillExample(nil, reflect.New(mt.Out(0))).Interface())
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
var response any
|
||||||
|
if !resultNotJSON {
|
||||||
|
response = reflect.New(mt.Out(0))
|
||||||
|
}
|
||||||
|
|
||||||
|
fmt.Fprintln(os.Stderr, "reading request from stdin...")
|
||||||
|
request, err := io.ReadAll(os.Stdin)
|
||||||
|
xcheckf(err, "read message")
|
||||||
|
|
||||||
|
dec := json.NewDecoder(bytes.NewReader(request))
|
||||||
|
dec.DisallowUnknownFields()
|
||||||
|
err = dec.Decode(reflect.New(mt.In(1)).Interface())
|
||||||
|
xcheckf(err, "parsing request")
|
||||||
|
|
||||||
|
resp, err := http.PostForm(args[1]+args[0], url.Values{"request": []string{string(request)}})
|
||||||
|
xcheckf(err, "http post")
|
||||||
|
defer resp.Body.Close()
|
||||||
|
if resp.StatusCode == http.StatusBadRequest {
|
||||||
|
buf, err := io.ReadAll(&moxio.LimitReader{R: resp.Body, Limit: 10 * 1024})
|
||||||
|
xcheckf(err, "reading response for 400 bad request error")
|
||||||
|
err = json.Unmarshal(buf, &response)
|
||||||
|
if err == nil {
|
||||||
|
printJSON("", response)
|
||||||
|
} else {
|
||||||
|
fmt.Fprintf(os.Stderr, "(not json)\n")
|
||||||
|
os.Stderr.Write(buf)
|
||||||
|
}
|
||||||
|
os.Exit(1)
|
||||||
|
} else if resp.StatusCode != http.StatusOK {
|
||||||
|
fmt.Fprintf(os.Stderr, "http response %s\n", resp.Status)
|
||||||
|
_, err := io.Copy(os.Stderr, resp.Body)
|
||||||
|
xcheckf(err, "copy body")
|
||||||
|
} else {
|
||||||
|
err := json.NewDecoder(resp.Body).Decode(&resp)
|
||||||
|
xcheckf(err, "unmarshal response")
|
||||||
|
printJSON("", response)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func printJSON(indent string, v any) {
|
||||||
|
fmt.Printf("%s", indent)
|
||||||
|
enc := json.NewEncoder(os.Stdout)
|
||||||
|
enc.SetIndent(indent, "\t")
|
||||||
|
enc.SetEscapeHTML(false)
|
||||||
|
err := enc.Encode(v)
|
||||||
|
xcheckf(err, "encode json")
|
||||||
|
}
|
||||||
|
|
||||||
// todo: should make it possible to run this command against a running mox. it should disconnect existing clients for accounts with a bumped uidvalidity, so they will reconnect and refetch the data.
|
// todo: should make it possible to run this command against a running mox. it should disconnect existing clients for accounts with a bumped uidvalidity, so they will reconnect and refetch the data.
|
||||||
func cmdBumpUIDValidity(c *cmd) {
|
func cmdBumpUIDValidity(c *cmd) {
|
||||||
c.params = "account [mailbox]"
|
c.params = "account [mailbox]"
|
||||||
|
@ -3020,6 +3132,8 @@ func cmdMessageParse(c *cmd) {
|
||||||
c.params = "message.eml"
|
c.params = "message.eml"
|
||||||
c.help = "Parse message, print JSON representation."
|
c.help = "Parse message, print JSON representation."
|
||||||
|
|
||||||
|
var smtputf8 bool
|
||||||
|
c.flag.BoolVar(&smtputf8, "smtputf8", false, "check if message needs smtputf8")
|
||||||
args := c.Parse()
|
args := c.Parse()
|
||||||
if len(args) != 1 {
|
if len(args) != 1 {
|
||||||
c.Usage()
|
c.Usage()
|
||||||
|
@ -3035,8 +3149,40 @@ func cmdMessageParse(c *cmd) {
|
||||||
xcheckf(err, "parsing nested parts")
|
xcheckf(err, "parsing nested parts")
|
||||||
enc := json.NewEncoder(os.Stdout)
|
enc := json.NewEncoder(os.Stdout)
|
||||||
enc.SetIndent("", "\t")
|
enc.SetIndent("", "\t")
|
||||||
|
enc.SetEscapeHTML(false)
|
||||||
err = enc.Encode(part)
|
err = enc.Encode(part)
|
||||||
xcheckf(err, "write")
|
xcheckf(err, "write")
|
||||||
|
|
||||||
|
hasNonASCII := func(r io.Reader) bool {
|
||||||
|
br := bufio.NewReader(r)
|
||||||
|
for {
|
||||||
|
b, err := br.ReadByte()
|
||||||
|
if err == io.EOF {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
xcheckf(err, "read header")
|
||||||
|
if b > 0x7f {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
var walk func(p *message.Part) bool
|
||||||
|
walk = func(p *message.Part) bool {
|
||||||
|
if hasNonASCII(p.HeaderReader()) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
for _, pp := range p.Parts {
|
||||||
|
if walk(&pp) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
if smtputf8 {
|
||||||
|
fmt.Println("message needs smtputf8:", walk(&part))
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
func cmdOpenaccounts(c *cmd) {
|
func cmdOpenaccounts(c *cmd) {
|
||||||
|
@ -3240,3 +3386,134 @@ Opens database files directly, not going through a running mox instance.
|
||||||
log.Printf("account %s, total time %s", accName, time.Since(t0))
|
log.Printf("account %s, total time %s", accName, time.Since(t0))
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func cmdQueueFillRetired(c *cmd) {
|
||||||
|
c.unlisted = true
|
||||||
|
c.help = `Fill retired messag and webhooks queue with testdata.
|
||||||
|
|
||||||
|
For testing the pagination. Operates directly on queue database.
|
||||||
|
`
|
||||||
|
var n int
|
||||||
|
c.flag.IntVar(&n, "n", 10000, "retired messages and retired webhooks to insert")
|
||||||
|
args := c.Parse()
|
||||||
|
if len(args) != 0 {
|
||||||
|
c.Usage()
|
||||||
|
}
|
||||||
|
|
||||||
|
mustLoadConfig()
|
||||||
|
err := queue.Init()
|
||||||
|
xcheckf(err, "init queue")
|
||||||
|
err = queue.DB.Write(context.Background(), func(tx *bstore.Tx) error {
|
||||||
|
now := time.Now()
|
||||||
|
|
||||||
|
// Cause autoincrement ID for queue.Msg to be forwarded, and use the reserved ID
|
||||||
|
// space for inserting retired messages.
|
||||||
|
fm := queue.Msg{}
|
||||||
|
err = tx.Insert(&fm)
|
||||||
|
xcheckf(err, "temporarily insert message to get autoincrement sequence")
|
||||||
|
err = tx.Delete(&fm)
|
||||||
|
xcheckf(err, "removing temporary message for resetting autoincrement sequence")
|
||||||
|
fm.ID += int64(n)
|
||||||
|
err = tx.Insert(&fm)
|
||||||
|
xcheckf(err, "temporarily insert message to forward autoincrement sequence")
|
||||||
|
err = tx.Delete(&fm)
|
||||||
|
xcheckf(err, "removing temporary message after forwarding autoincrement sequence")
|
||||||
|
fm.ID -= int64(n)
|
||||||
|
|
||||||
|
// And likewise for webhooks.
|
||||||
|
fh := queue.Hook{Account: "x", URL: "x", NextAttempt: time.Now()}
|
||||||
|
err = tx.Insert(&fh)
|
||||||
|
xcheckf(err, "temporarily insert webhook to get autoincrement sequence")
|
||||||
|
err = tx.Delete(&fh)
|
||||||
|
xcheckf(err, "removing temporary webhook for resetting autoincrement sequence")
|
||||||
|
fh.ID += int64(n)
|
||||||
|
err = tx.Insert(&fh)
|
||||||
|
xcheckf(err, "temporarily insert webhook to forward autoincrement sequence")
|
||||||
|
err = tx.Delete(&fh)
|
||||||
|
xcheckf(err, "removing temporary webhook after forwarding autoincrement sequence")
|
||||||
|
fh.ID -= int64(n)
|
||||||
|
|
||||||
|
for i := 0; i < n; i++ {
|
||||||
|
t0 := now.Add(-time.Duration(i) * time.Second)
|
||||||
|
last := now.Add(-time.Duration(i/10) * time.Second)
|
||||||
|
mr := queue.MsgRetired{
|
||||||
|
ID: fm.ID + int64(i),
|
||||||
|
Queued: t0,
|
||||||
|
SenderAccount: "test",
|
||||||
|
SenderLocalpart: "mox",
|
||||||
|
SenderDomainStr: "localhost",
|
||||||
|
FromID: fmt.Sprintf("%016d", i),
|
||||||
|
RecipientLocalpart: "mox",
|
||||||
|
RecipientDomain: dns.IPDomain{Domain: dns.Domain{ASCII: "localhost"}},
|
||||||
|
RecipientDomainStr: "localhost",
|
||||||
|
Attempts: i % 6,
|
||||||
|
LastAttempt: &last,
|
||||||
|
Results: []queue.MsgResult{
|
||||||
|
{
|
||||||
|
Start: last,
|
||||||
|
Duration: time.Millisecond,
|
||||||
|
Success: i%10 != 0,
|
||||||
|
Code: 250,
|
||||||
|
},
|
||||||
|
},
|
||||||
|
Has8bit: i%2 == 0,
|
||||||
|
SMTPUTF8: i%8 == 0,
|
||||||
|
Size: int64(i * 100),
|
||||||
|
MessageID: fmt.Sprintf("<msg%d@localhost>", i),
|
||||||
|
Subject: fmt.Sprintf("test message %d", i),
|
||||||
|
Extra: map[string]string{"i": fmt.Sprintf("%d", i)},
|
||||||
|
LastActivity: last,
|
||||||
|
RecipientAddress: "mox@localhost",
|
||||||
|
Success: i%10 != 0,
|
||||||
|
KeepUntil: now.Add(48 * time.Hour),
|
||||||
|
}
|
||||||
|
err := tx.Insert(&mr)
|
||||||
|
xcheckf(err, "inserting retired message")
|
||||||
|
}
|
||||||
|
|
||||||
|
for i := 0; i < n; i++ {
|
||||||
|
t0 := now.Add(-time.Duration(i) * time.Second)
|
||||||
|
last := now.Add(-time.Duration(i/10) * time.Second)
|
||||||
|
var event string
|
||||||
|
if i%10 != 0 {
|
||||||
|
event = "delivered"
|
||||||
|
}
|
||||||
|
hr := queue.HookRetired{
|
||||||
|
ID: fh.ID + int64(i),
|
||||||
|
QueueMsgID: fm.ID + int64(i),
|
||||||
|
FromID: fmt.Sprintf("%016d", i),
|
||||||
|
MessageID: fmt.Sprintf("<msg%d@localhost>", i),
|
||||||
|
Subject: fmt.Sprintf("test message %d", i),
|
||||||
|
Extra: map[string]string{"i": fmt.Sprintf("%d", i)},
|
||||||
|
Account: "test",
|
||||||
|
URL: "http://localhost/hook",
|
||||||
|
IsIncoming: i%10 == 0,
|
||||||
|
OutgoingEvent: event,
|
||||||
|
Payload: "{}",
|
||||||
|
|
||||||
|
Submitted: t0,
|
||||||
|
Attempts: i % 6,
|
||||||
|
Results: []queue.HookResult{
|
||||||
|
{
|
||||||
|
Start: t0,
|
||||||
|
Duration: time.Millisecond,
|
||||||
|
URL: "http://localhost/hook",
|
||||||
|
Success: i%10 != 0,
|
||||||
|
Code: 200,
|
||||||
|
Response: "ok",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
|
||||||
|
Success: i%10 != 0,
|
||||||
|
LastActivity: last,
|
||||||
|
KeepUntil: now.Add(48 * time.Hour),
|
||||||
|
}
|
||||||
|
err := tx.Insert(&hr)
|
||||||
|
xcheckf(err, "inserting retired hook")
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
})
|
||||||
|
xcheckf(err, "add to queue")
|
||||||
|
log.Printf("added %d retired messages and %d retired webhooks", n, n)
|
||||||
|
}
|
||||||
|
|
|
@@ -141,7 +141,7 @@ func (c *Composer) Line() {
 // with newlines (lf), which are replaced with crlf. The returned text may be
 // quotedprintable, if needed. The returned ct and cte headers are for use with
 // Content-Type and Content-Transfer-Encoding headers.
-func (c *Composer) TextPart(text string) (textBody []byte, ct, cte string) {
+func (c *Composer) TextPart(subtype, text string) (textBody []byte, ct, cte string) {
 	if !strings.HasSuffix(text, "\n") {
 		text += "\n"
 	}
@@ -162,7 +162,7 @@ func (c *Composer) TextPart(text string) (textBody []byte, ct, cte string) {
 		cte = "7bit"
 	}
 
-	ct = mime.FormatMediaType("text/plain", map[string]string{"charset": charset})
+	ct = mime.FormatMediaType("text/"+subtype, map[string]string{"charset": charset})
 	return []byte(text), ct, cte
 }
 
@@ -120,7 +120,7 @@ func ExampleComposer() {
 	xc.Header("MIME-Version", "1.0")
 
 	// Write content-* headers for the text body.
-	body, ct, cte := xc.TextPart("this is the body")
+	body, ct, cte := xc.TextPart("plain", "this is the body")
 	xc.Header("Content-Type", ct)
 	xc.Header("Content-Transfer-Encoding", cte)
 
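TextPart now takes the subtype, so callers can build parts other than text/plain, as the updated ExampleComposer shows. A tiny standalone sketch of what the changed Content-Type formatting amounts to (illustrative only, not mox code):

	package main

	import (
		"fmt"
		"mime"
	)

	// textPartContentType mimics what Composer.TextPart now does for Content-Type:
	// the subtype is caller-chosen instead of hardcoded "plain".
	func textPartContentType(subtype, charset string) string {
		return mime.FormatMediaType("text/"+subtype, map[string]string{"charset": charset})
	}

	func main() {
		fmt.Println(textPartContentType("plain", "utf-8"))    // text/plain; charset=utf-8
		fmt.Println(textPartContentType("html", "us-ascii"))  // text/html; charset=us-ascii
	}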
@@ -97,8 +97,8 @@ type Envelope struct {
 	To []Address
 	CC []Address
 	BCC []Address
-	InReplyTo string
-	MessageID string
+	InReplyTo string // From In-Reply-To header, includes <>.
+	MessageID string // From Message-Id header, includes <>.
 }
 
 // Address as used in From and To headers.
|
@ -13,8 +13,8 @@ var (
|
||||||
Help: "Authentication attempts and results.",
|
Help: "Authentication attempts and results.",
|
||||||
},
|
},
|
||||||
[]string{
|
[]string{
|
||||||
"kind", // submission, imap, webmail, webaccount, webadmin (formerly httpaccount, httpadmin)
|
"kind", // submission, imap, webmail, webapi, webaccount, webadmin (formerly httpaccount, httpadmin)
|
||||||
"variant", // login, plain, scram-sha-256, scram-sha-1, cram-md5, weblogin, websessionuse. formerly: httpbasic.
|
"variant", // login, plain, scram-sha-256, scram-sha-1, cram-md5, weblogin, websessionuse, httpbasic.
|
||||||
// todo: we currently only use badcreds, but known baduser can be helpful
|
// todo: we currently only use badcreds, but known baduser can be helpful
|
||||||
"result", // ok, baduser, badpassword, badcreds, error, aborted
|
"result", // ok, baduser, badpassword, badcreds, error, aborted
|
||||||
},
|
},
|
||||||
|
|
|
@ -35,6 +35,7 @@ const (
|
||||||
Importmessages Panic = "importmessages"
|
Importmessages Panic = "importmessages"
|
||||||
Store Panic = "store"
|
Store Panic = "store"
|
||||||
Webadmin Panic = "webadmin"
|
Webadmin Panic = "webadmin"
|
||||||
|
Webapi Panic = "webapi"
|
||||||
Webmailsendevent Panic = "webmailsendevent"
|
Webmailsendevent Panic = "webmailsendevent"
|
||||||
Webmail Panic = "webmail"
|
Webmail Panic = "webmail"
|
||||||
Webmailrequest Panic = "webmailrequest"
|
Webmailrequest Panic = "webmailrequest"
|
||||||
|
mox-/admin.go | 151
@@ -899,6 +899,7 @@ func AddressAdd(ctx context.Context, address, account string) (rerr error) {
 }
 
 // AddressRemove removes an email address and reloads the configuration.
+// Address can be a catchall address for the domain of the form "@<domain>".
 func AddressRemove(ctx context.Context, address string) (rerr error) {
 	log := pkglog.WithContext(ctx)
 	defer func() {
@ -934,6 +935,52 @@ func AddressRemove(ctx context.Context, address string) (rerr error) {
|
||||||
if !dropped {
|
if !dropped {
|
||||||
return fmt.Errorf("address not removed, likely a postmaster/reporting address")
|
return fmt.Errorf("address not removed, likely a postmaster/reporting address")
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Also remove matching address from FromIDLoginAddresses, composing a new slice.
|
||||||
|
var fromIDLoginAddresses []string
|
||||||
|
var dom dns.Domain
|
||||||
|
var pa smtp.Address // For non-catchall addresses (most).
|
||||||
|
var err error
|
||||||
|
if strings.HasPrefix(address, "@") {
|
||||||
|
dom, err = dns.ParseDomain(address[1:])
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("parsing domain for catchall address: %v", err)
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
pa, err = smtp.ParseAddress(address)
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("parsing address: %v", err)
|
||||||
|
}
|
||||||
|
dom = pa.Domain
|
||||||
|
}
|
||||||
|
for i, fa := range a.ParsedFromIDLoginAddresses {
|
||||||
|
if fa.Domain != dom {
|
||||||
|
// Keep for different domain.
|
||||||
|
fromIDLoginAddresses = append(fromIDLoginAddresses, a.FromIDLoginAddresses[i])
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
if strings.HasPrefix(address, "@") {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
dc, ok := Conf.Dynamic.Domains[dom.Name()]
|
||||||
|
if !ok {
|
||||||
|
return fmt.Errorf("unknown domain in fromid login address %q", fa.Pack(true))
|
||||||
|
}
|
||||||
|
flp, err := CanonicalLocalpart(fa.Localpart, dc)
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("getting canonical localpart for fromid login address %q: %v", fa.Localpart, err)
|
||||||
|
}
|
||||||
|
alp, err := CanonicalLocalpart(pa.Localpart, dc)
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("getting canonical part for address: %v", err)
|
||||||
|
}
|
||||||
|
if alp != flp {
|
||||||
|
// Keep for different localpart.
|
||||||
|
fromIDLoginAddresses = append(fromIDLoginAddresses, a.FromIDLoginAddresses[i])
|
||||||
|
}
|
||||||
|
}
|
||||||
|
na.FromIDLoginAddresses = fromIDLoginAddresses
|
||||||
|
|
||||||
nc := Conf.Dynamic
|
nc := Conf.Dynamic
|
||||||
nc.Accounts = map[string]config.Account{}
|
nc.Accounts = map[string]config.Account{}
|
||||||
for name, a := range Conf.Dynamic.Accounts {
|
for name, a := range Conf.Dynamic.Accounts {
|
||||||
|
@ -948,12 +995,16 @@ func AddressRemove(ctx context.Context, address string) (rerr error) {
|
||||||
return nil
|
return nil
|
||||||
}
|
}
|
||||||
|
|
||||||
// AccountFullNameSave updates the full name for an account and reloads the configuration.
|
// AccountSave updates the configuration of an account. Function xmodify is called
|
||||||
func AccountFullNameSave(ctx context.Context, account, fullName string) (rerr error) {
|
// with a shallow copy of the current configuration of the account. It must not
|
||||||
|
// change referencing fields (e.g. existing slice/map/pointer), they may still be
|
||||||
|
// in use, and the change may be rolled back. Referencing values must be copied and
|
||||||
|
// replaced by the modify. The function may raise a panic for error handling.
|
||||||
|
func AccountSave(ctx context.Context, account string, xmodify func(acc *config.Account)) (rerr error) {
|
||||||
log := pkglog.WithContext(ctx)
|
log := pkglog.WithContext(ctx)
|
||||||
defer func() {
|
defer func() {
|
||||||
if rerr != nil {
|
if rerr != nil {
|
||||||
log.Errorx("saving account full name", rerr, slog.String("account", account))
|
log.Errorx("saving account fields", rerr, slog.String("account", account))
|
||||||
}
|
}
|
||||||
}()
|
}()
|
||||||
|
|
||||||
|
@ -966,6 +1017,8 @@ func AccountFullNameSave(ctx context.Context, account, fullName string) (rerr er
|
||||||
return fmt.Errorf("account not present")
|
return fmt.Errorf("account not present")
|
||||||
}
|
}
|
||||||
|
|
||||||
|
xmodify(&acc)
|
||||||
|
|
||||||
// Compose new config without modifying existing data structures. If we fail, we
|
// Compose new config without modifying existing data structures. If we fail, we
|
||||||
// leave no trace.
|
// leave no trace.
|
||||||
nc := c
|
nc := c
|
||||||
|
@ -973,100 +1026,12 @@ func AccountFullNameSave(ctx context.Context, account, fullName string) (rerr er
|
||||||
for name, a := range c.Accounts {
|
for name, a := range c.Accounts {
|
||||||
nc.Accounts[name] = a
|
nc.Accounts[name] = a
|
||||||
}
|
}
|
||||||
|
|
||||||
acc.FullName = fullName
|
|
||||||
nc.Accounts[account] = acc
|
nc.Accounts[account] = acc
|
||||||
|
|
||||||
if err := writeDynamic(ctx, log, nc); err != nil {
|
if err := writeDynamic(ctx, log, nc); err != nil {
|
||||||
return fmt.Errorf("writing domains.conf: %v", err)
|
return fmt.Errorf("writing domains.conf: %w", err)
|
||||||
}
|
}
|
||||||
log.Info("account full name saved", slog.String("account", account))
|
log.Info("account fields saved", slog.String("account", account))
|
||||||
return nil
|
|
||||||
}
|
|
||||||
|
|
||||||
// DestinationSave updates a destination for an account and reloads the configuration.
|
|
||||||
func DestinationSave(ctx context.Context, account, destName string, newDest config.Destination) (rerr error) {
|
|
||||||
log := pkglog.WithContext(ctx)
|
|
||||||
defer func() {
|
|
||||||
if rerr != nil {
|
|
||||||
log.Errorx("saving destination", rerr,
|
|
||||||
slog.String("account", account),
|
|
||||||
slog.String("destname", destName),
|
|
||||||
slog.Any("destination", newDest))
|
|
||||||
}
|
|
||||||
}()
|
|
||||||
|
|
||||||
Conf.dynamicMutex.Lock()
|
|
||||||
defer Conf.dynamicMutex.Unlock()
|
|
||||||
|
|
||||||
c := Conf.Dynamic
|
|
||||||
acc, ok := c.Accounts[account]
|
|
||||||
if !ok {
|
|
||||||
return fmt.Errorf("account not present")
|
|
||||||
}
|
|
||||||
|
|
||||||
if _, ok := acc.Destinations[destName]; !ok {
|
|
||||||
return fmt.Errorf("destination not present")
|
|
||||||
}
|
|
||||||
|
|
||||||
// Compose new config without modifying existing data structures. If we fail, we
|
|
||||||
// leave no trace.
|
|
||||||
nc := c
|
|
||||||
nc.Accounts = map[string]config.Account{}
|
|
||||||
for name, a := range c.Accounts {
|
|
||||||
nc.Accounts[name] = a
|
|
||||||
}
|
|
||||||
nd := map[string]config.Destination{}
|
|
||||||
for dn, d := range acc.Destinations {
|
|
||||||
nd[dn] = d
|
|
||||||
}
|
|
||||||
nd[destName] = newDest
|
|
||||||
nacc := nc.Accounts[account]
|
|
||||||
nacc.Destinations = nd
|
|
||||||
nc.Accounts[account] = nacc
|
|
||||||
|
|
||||||
if err := writeDynamic(ctx, log, nc); err != nil {
|
|
||||||
return fmt.Errorf("writing domains.conf: %v", err)
|
|
||||||
}
|
|
||||||
log.Info("destination saved", slog.String("account", account), slog.String("destname", destName))
|
|
||||||
return nil
|
|
||||||
}
|
|
||||||
|
|
||||||
// AccountAdminSettingsSave saves new account settings for an account only an admin can change.
|
|
||||||
func AccountAdminSettingsSave(ctx context.Context, account string, maxOutgoingMessagesPerDay, maxFirstTimeRecipientsPerDay int, quotaMessageSize int64, firstTimeSenderDelay bool) (rerr error) {
|
|
||||||
log := pkglog.WithContext(ctx)
|
|
||||||
defer func() {
|
|
||||||
if rerr != nil {
|
|
||||||
log.Errorx("saving admin account settings", rerr, slog.String("account", account))
|
|
||||||
}
|
|
||||||
}()
|
|
||||||
|
|
||||||
Conf.dynamicMutex.Lock()
|
|
||||||
defer Conf.dynamicMutex.Unlock()
|
|
||||||
|
|
||||||
c := Conf.Dynamic
|
|
||||||
acc, ok := c.Accounts[account]
|
|
||||||
if !ok {
|
|
||||||
return fmt.Errorf("account not present")
|
|
||||||
}
|
|
||||||
|
|
||||||
// Compose new config without modifying existing data structures. If we fail, we
|
|
||||||
// leave no trace.
|
|
||||||
nc := c
|
|
||||||
nc.Accounts = map[string]config.Account{}
|
|
||||||
for name, a := range c.Accounts {
|
|
||||||
nc.Accounts[name] = a
|
|
||||||
}
|
|
||||||
acc.MaxOutgoingMessagesPerDay = maxOutgoingMessagesPerDay
|
|
||||||
acc.MaxFirstTimeRecipientsPerDay = maxFirstTimeRecipientsPerDay
|
|
||||||
acc.QuotaMessageSize = quotaMessageSize
|
|
||||||
acc.NoFirstTimeSenderDelay = !firstTimeSenderDelay
|
|
||||||
nc.Accounts[account] = acc
|
|
||||||
|
|
||||||
if err := writeDynamic(ctx, log, nc); err != nil {
|
|
||||||
return fmt.Errorf("writing domains.conf: %v", err)
|
|
||||||
}
|
|
||||||
log.Info("admin account settings saved", slog.String("account", account))
|
|
||||||
return nil
|
return nil
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
|
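For illustration, a sketch of how a caller might use the new AccountSave and its xmodify callback (the account name and field values here are hypothetical; per the doc comment, the callback receives a shallow copy and must replace, not mutate, referencing fields):

package main

import (
	"context"
	"log"

	"github.com/mjl-/mox/config"
	"github.com/mjl-/mox/mox-"
)

func example(ctx context.Context) {
	// "exampleaccount" and the values below are made up.
	err := mox.AccountSave(ctx, "exampleaccount", func(acc *config.Account) {
		acc.FullName = "New Full Name"
		// Replace the slice instead of appending to the existing one.
		acc.FromIDLoginAddresses = []string{"sender@example.com"}
	})
	if err != nil {
		log.Printf("saving account: %v", err)
	}
}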
@@ -61,6 +61,8 @@ var (
	Conf = Config{Log: map[string]slog.Level{"": slog.LevelError}}
 )

+var ErrConfig = errors.New("config error")
+
 // Config as used in the code, a processed version of what is in the config file.
 //
 // Use methods to lookup a domain/account/address in the dynamic configuration.

@@ -317,10 +319,11 @@ func (c *Config) allowACMEHosts(log mlog.Log, checkACMEHosts bool) {
 // todo future: write config parsing & writing code that can read a config and remembers the exact tokens including newlines and comments, and can write back a modified file. the goal is to be able to write a config file automatically (after changing fields through the ui), but not loose comments and whitespace, to still get useful diffs for storing the config in a version control system.

 // must be called with lock held.
+// Returns ErrConfig if the configuration is not valid.
 func writeDynamic(ctx context.Context, log mlog.Log, c config.Dynamic) error {
	accDests, errs := prepareDynamicConfig(ctx, log, ConfigDynamicPath, Conf.Static, &c)
	if len(errs) > 0 {
-		return errs[0]
+		return fmt.Errorf("%w: %v", ErrConfig, errs[0])
	}

	var b bytes.Buffer
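Because the error is now wrapped with %w, callers can detect validation failures with errors.Is. A minimal sketch, where applyChange is a hypothetical stand-in for any admin call that ends up in writeDynamic:

package main

import (
	"errors"
	"log"

	"github.com/mjl-/mox/mox-"
)

func applyChange(do func() error) {
	if err := do(); err != nil {
		if errors.Is(err, mox.ErrConfig) {
			log.Printf("new configuration does not validate: %v", err)
			return
		}
		log.Printf("writing/reloading domains.conf failed: %v", err)
	}
}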
@@ -1272,8 +1275,52 @@ func prepareDynamicConfig(ctx context.Context, log mlog.Log, dynamicPath string,
		}
		acc.NotJunkMailbox = r
	}

+	acc.ParsedFromIDLoginAddresses = make([]smtp.Address, len(acc.FromIDLoginAddresses))
+	for i, s := range acc.FromIDLoginAddresses {
+		a, err := smtp.ParseAddress(s)
+		if err != nil {
+			addErrorf("invalid fromid login address %q in account %q: %v", s, accName, err)
+		}
+		// We check later on if address belongs to account.
+		dom, ok := c.Domains[a.Domain.Name()]
+		if !ok {
+			addErrorf("unknown domain in fromid login address %q for account %q", s, accName)
+		} else if dom.LocalpartCatchallSeparator == "" {
+			addErrorf("localpart catchall separator not configured for domain for fromid login address %q for account %q", s, accName)
+		}
+		acc.ParsedFromIDLoginAddresses[i] = a
+	}
+
	c.Accounts[accName] = acc

+	if acc.OutgoingWebhook != nil {
+		u, err := url.Parse(acc.OutgoingWebhook.URL)
+		if err == nil && (u.Scheme != "http" && u.Scheme != "https") {
+			err = errors.New("scheme must be http or https")
+		}
+		if err != nil {
+			addErrorf("parsing outgoing hook url %q in account %q: %v", acc.OutgoingWebhook.URL, accName, err)
+		}
+
+		// note: outgoing hook events are in ../queue/hooks.go, ../mox-/config.go, ../queue.go and ../webapi/gendoc.sh. keep in sync.
+		outgoingHookEvents := []string{"delivered", "suppressed", "delayed", "failed", "relayed", "expanded", "canceled", "unrecognized"}
+		for _, e := range acc.OutgoingWebhook.Events {
+			if !slices.Contains(outgoingHookEvents, e) {
+				addErrorf("unknown outgoing hook event %q", e)
+			}
+		}
+	}
+	if acc.IncomingWebhook != nil {
+		u, err := url.Parse(acc.IncomingWebhook.URL)
+		if err == nil && (u.Scheme != "http" && u.Scheme != "https") {
+			err = errors.New("scheme must be http or https")
+		}
+		if err != nil {
+			addErrorf("parsing incoming hook url %q in account %q: %v", acc.IncomingWebhook.URL, accName, err)
+		}
+	}
+
	// todo deprecated: only localpart as keys for Destinations, we are replacing them with full addresses. if domains.conf is written, we won't have to do this again.
	replaceLocalparts := map[string]string{}
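For illustration, a minimal sketch of an HTTP endpoint that could receive the outgoing webhook configured above. Only the Event field name and the listed event values are taken from this change; the full payload schema lives in the webhook package and is not reproduced here.

package main

import (
	"encoding/json"
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/mox/outgoing", func(w http.ResponseWriter, r *http.Request) {
		// Assumed minimal shape; the real payload carries more fields.
		var payload struct {
			Event string
		}
		if err := json.NewDecoder(r.Body).Decode(&payload); err != nil {
			http.Error(w, "bad json", http.StatusBadRequest)
			return
		}
		// One of: delivered, suppressed, delayed, failed, relayed, expanded, canceled, unrecognized.
		log.Printf("outgoing delivery event: %q", payload.Event)
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:8488", nil))
}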
@@ -1423,6 +1470,25 @@ func prepareDynamicConfig(ctx context.Context, log mlog.Log, dynamicPath string,
			}
		}

+		// Now that all addresses are parsed, check if all fromid login addresses match
+		// configured addresses.
+		for i, a := range acc.ParsedFromIDLoginAddresses {
+			// For domain catchall.
+			if _, ok := accDests["@"+a.Domain.Name()]; ok {
+				continue
+			}
+			dc := c.Domains[a.Domain.Name()]
+			lp, err := CanonicalLocalpart(a.Localpart, dc)
+			if err != nil {
+				addErrorf("canonicalizing localpart for fromid login address %q in account %q: %v", acc.FromIDLoginAddresses[i], accName, err)
+				continue
+			}
+			a.Localpart = lp
+			if _, ok := accDests[a.Pack(true)]; !ok {
+				addErrorf("fromid login address %q for account %q does not match its destination addresses", acc.FromIDLoginAddresses[i], accName)
+			}
+		}
+
		checkRoutes("routes for account", acc.Routes)
	}

mox-/fill.go (new file, 113 lines)

@@ -0,0 +1,113 @@
package mox

import (
	"reflect"
)

// FillNil returns a modified value with nil maps/slices replaced with empty
// maps/slices.
func FillNil(rv reflect.Value) (nv reflect.Value, changed bool) {
	switch rv.Kind() {
	case reflect.Struct:
		for i := 0; i < rv.NumField(); i++ {
			if !rv.Type().Field(i).IsExported() {
				continue
			}
			vv := rv.Field(i)
			nvv, ch := FillNil(vv)
			if ch && !rv.CanSet() {
				// Make struct settable.
				nrv := reflect.New(rv.Type()).Elem()
				for j := 0; j < rv.NumField(); j++ {
					nrv.Field(j).Set(rv.Field(j))
				}
				rv = nrv
				vv = rv.Field(i)
			}
			if ch {
				changed = true
				vv.Set(nvv)
			}
		}
	case reflect.Slice:
		if rv.IsNil() {
			return reflect.MakeSlice(rv.Type(), 0, 0), true
		}
		n := rv.Len()
		for i := 0; i < n; i++ {
			rve := rv.Index(i)
			nrv, ch := FillNil(rve)
			if ch {
				changed = true
				rve.Set(nrv)
			}
		}
	case reflect.Map:
		if rv.IsNil() {
			return reflect.MakeMap(rv.Type()), true
		}
		i := rv.MapRange()
		for i.Next() {
			erv, ch := FillNil(i.Value())
			if ch {
				changed = true
				rv.SetMapIndex(i.Key(), erv)
			}
		}
	case reflect.Pointer:
		if !rv.IsNil() {
			FillNil(rv.Elem())
		}
	}
	return rv, changed
}

// FillExample returns a modified value with nil/empty maps/slices/pointers values
// replaced with non-empty versions, for more helpful examples of types. Useful for
// documenting JSON representations of types.
func FillExample(seen []reflect.Type, rv reflect.Value) reflect.Value {
	if seen == nil {
		seen = make([]reflect.Type, 100)
	}

	// Prevent recursive filling.
	rvt := rv.Type()
	index := -1
	for i, t := range seen {
		if t == rvt {
			return rv
		} else if t == nil {
			index = i
		}
	}
	if index < 0 {
		return rv
	}
	seen[index] = rvt
	defer func() {
		seen[index] = nil
	}()

	switch rv.Kind() {
	case reflect.Struct:
		for i := 0; i < rv.NumField(); i++ {
			if !rvt.Field(i).IsExported() {
				continue
			}
			vv := rv.Field(i)
			vv.Set(FillExample(seen, vv))
		}
	case reflect.Slice:
		ev := FillExample(seen, reflect.New(rvt.Elem()).Elem())
		return reflect.Append(rv, ev)
	case reflect.Map:
		vv := FillExample(seen, reflect.New(rvt.Elem()).Elem())
		nv := reflect.MakeMap(rvt)
		nv.SetMapIndex(reflect.ValueOf("example"), vv)
		return nv
	case reflect.Pointer:
		nv := reflect.New(rvt.Elem())
		return FillExample(seen, nv.Elem()).Addr()
	}
	return rv
}
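A sketch of how FillNil and FillExample can be combined to print an example JSON document for an API type (the Request type below is made up for illustration):

package main

import (
	"encoding/json"
	"fmt"
	"reflect"

	"github.com/mjl-/mox/mox-"
)

// Request is a hypothetical API type with fields that start out nil.
type Request struct {
	Names  []string
	Labels map[string]string
}

func main() {
	var req Request
	// Replace nil slices/maps with empty ones, then fill in example values.
	v, _ := mox.FillNil(reflect.ValueOf(&req).Elem())
	v = mox.FillExample(nil, v)
	buf, _ := json.MarshalIndent(v.Interface(), "", "  ")
	fmt.Println(string(buf))
}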
mox-/localserve.go (new file, 31 lines)

@@ -0,0 +1,31 @@
package mox

import (
	"strconv"
	"strings"

	"github.com/mjl-/mox/smtp"
)

func LocalserveNeedsError(lp smtp.Localpart) (code int, timeout bool) {
	s := string(lp)
	if strings.HasSuffix(s, "temperror") {
		return smtp.C451LocalErr, false
	} else if strings.HasSuffix(s, "permerror") {
		return smtp.C550MailboxUnavail, false
	} else if strings.HasSuffix(s, "timeout") {
		return 0, true
	}
	if len(s) < 3 {
		return 0, false
	}
	s = s[len(s)-3:]
	v, err := strconv.ParseInt(s, 10, 32)
	if err != nil {
		return 0, false
	}
	if v < 400 || v > 600 {
		return 0, false
	}
	return int(v), false
}
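In localserve, the recipient localpart now picks the simulated outcome. A small sketch of what the helper returns for a few made-up localparts:

package main

import (
	"fmt"

	"github.com/mjl-/mox/mox-"
	"github.com/mjl-/mox/smtp"
)

func main() {
	// Hypothetical localparts; suffixes and a trailing 3-digit code pick the result.
	for _, lp := range []smtp.Localpart{"alice", "bob+temperror", "carol+permerror", "dave+timeout", "erin+451"} {
		code, timeout := mox.LocalserveNeedsError(lp)
		fmt.Printf("%-16s code=%d timeout=%v\n", string(lp), code, timeout)
	}
}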
@@ -1,5 +1,7 @@
 package moxio

+// similar between ../moxio/limitreader.go and ../webapi/limitreader.go
+
 import (
	"errors"
	"io"

@@ -8,12 +8,16 @@ import (
 // Version is set at runtime based on the Go module used to build.
 var Version = "(devel)"

+// VersionBare does not add a "+modifications" or other suffix to the version.
+var VersionBare = "(devel)"
+
 func init() {
	buildInfo, ok := debug.ReadBuildInfo()
	if !ok {
		return
	}
	Version = buildInfo.Main.Version
+	VersionBare = buildInfo.Main.Version
	if Version == "(devel)" {
		var vcsRev, vcsMod string
		for _, setting := range buildInfo.Settings {

@@ -27,6 +31,7 @@ func init() {
		return
	}
	Version = vcsRev
+	VersionBare = vcsRev
	switch vcsMod {
	case "false":
	case "true":

queue.go (512 changed lines)

@@ -14,6 +14,12 @@ import (
	"github.com/mjl-/mox/queue"
 )

+func xctlwriteJSON(ctl *ctl, v any) {
+	fbuf, err := json.Marshal(v)
+	xcheckf(err, "marshal as json to ctl")
+	ctl.xwrite(string(fbuf))
+}
+
 func cmdQueueHoldrulesList(c *cmd) {
	c.help = `List hold rules for the delivery queue.

@@ -84,9 +90,9 @@ func ctlcmdQueueHoldrulesRemove(ctl *ctl, id int64) {
	ctl.xreadok()
 }

-// flagFilter is used by many of the queue commands to accept flags for filtering
-// the messages the operation applies to.
-func flagFilter(fs *flag.FlagSet, f *queue.Filter) {
+// flagFilterSort is used by many of the queue commands to accept flags for
+// filtering the messages the operation applies to.
+func flagFilterSort(fs *flag.FlagSet, f *queue.Filter, s *queue.Sort) {
	fs.Func("ids", "comma-separated list of message IDs", func(v string) error {
		for _, s := range strings.Split(v, ",") {
			id, err := strconv.ParseInt(s, 10, 64)

@@ -97,6 +103,7 @@ func flagFilter(fs *flag.FlagSet, f *queue.Filter) {
		}
		return nil
	})
+	fs.IntVar(&f.Max, "n", 0, "number of messages to return")
	fs.StringVar(&f.Account, "account", "", "account that queued the message")
	fs.StringVar(&f.From, "from", "", `from address of message, use "@example.com" to match all messages for a domain`)
	fs.StringVar(&f.To, "to", "", `recipient address of message, use "@example.com" to match all messages for a domain`)

@@ -118,32 +125,93 @@ func flagFilter(fs *flag.FlagSet, f *queue.Filter) {
		f.Hold = &hold
		return nil
	})
+	if s != nil {
+		fs.Func("sort", `field to sort by, "nextattempt" (default) or "queued"`, func(v string) error {
+			switch v {
+			case "nextattempt":
+				s.Field = "NextAttempt"
+			case "queued":
+				s.Field = "Queued"
+			default:
+				return fmt.Errorf("unknown value %q", v)
+			}
+			return nil
+		})
+		fs.BoolVar(&s.Asc, "asc", false, "sort ascending instead of descending (default)")
+	}
+}
+
+// flagRetiredFilterSort has filters for retired messages.
+func flagRetiredFilterSort(fs *flag.FlagSet, f *queue.RetiredFilter, s *queue.RetiredSort) {
+	fs.Func("ids", "comma-separated list of retired message IDs", func(v string) error {
+		for _, s := range strings.Split(v, ",") {
+			id, err := strconv.ParseInt(s, 10, 64)
+			if err != nil {
+				return err
+			}
+			f.IDs = append(f.IDs, id)
+		}
+		return nil
+	})
+	fs.IntVar(&f.Max, "n", 0, "number of messages to return")
+	fs.StringVar(&f.Account, "account", "", "account that queued the message")
+	fs.StringVar(&f.From, "from", "", `from address of message, use "@example.com" to match all messages for a domain`)
+	fs.StringVar(&f.To, "to", "", `recipient address of message, use "@example.com" to match all messages for a domain`)
+	fs.StringVar(&f.Submitted, "submitted", "", `filter by time of submission relative to now, value must start with "<" (before now) or ">" (after now)`)
+	fs.StringVar(&f.LastActivity, "lastactivity", "", `filter by time of last activity relative to now, value must start with "<" (before now) or ">" (after now)`)
+	fs.Func("transport", "transport to use for messages, empty string sets the default behaviour", func(v string) error {
+		f.Transport = &v
+		return nil
+	})
+	fs.Func("result", `"success" or "failure" as result of delivery`, func(v string) error {
+		switch v {
+		case "success":
+			t := true
+			f.Success = &t
+		case "failure":
+			t := false
+			f.Success = &t
+		default:
+			return fmt.Errorf("bad argument %q, need success or failure", v)
+		}
+		return nil
+	})
+	if s != nil {
+		fs.Func("sort", `field to sort by, "lastactivity" (default) or "queued"`, func(v string) error {
+			switch v {
+			case "lastactivity":
+				s.Field = "LastActivity"
+			case "queued":
+				s.Field = "Queued"
+			default:
+				return fmt.Errorf("unknown value %q", v)
+			}
+			return nil
+		})
+		fs.BoolVar(&s.Asc, "asc", false, "sort ascending instead of descending (default)")
+	}
 }

 func cmdQueueList(c *cmd) {
-	c.params = "[filterflags]"
+	c.params = "[filtersortflags]"
	c.help = `List matching messages in the delivery queue.

 Prints the message with its ID, last and next delivery attempts, last error.
 `
	var f queue.Filter
-	flagFilter(c.flag, &f)
+	var s queue.Sort
+	flagFilterSort(c.flag, &f, &s)
	if len(c.Parse()) != 0 {
		c.Usage()
	}
	mustLoadConfig()
-	ctlcmdQueueList(xctl(), f)
+	ctlcmdQueueList(xctl(), f, s)
 }

-func xctlwritequeuefilter(ctl *ctl, f queue.Filter) {
-	fbuf, err := json.Marshal(f)
-	xcheckf(err, "marshal filter")
-	ctl.xwrite(string(fbuf))
-}
-
-func ctlcmdQueueList(ctl *ctl, f queue.Filter) {
+func ctlcmdQueueList(ctl *ctl, f queue.Filter, s queue.Sort) {
	ctl.xwrite("queuelist")
-	xctlwritequeuefilter(ctl, f)
+	xctlwriteJSON(ctl, f)
+	xctlwriteJSON(ctl, s)
	ctl.xreadok()
	if _, err := io.Copy(os.Stdout, ctl.reader()); err != nil {
		log.Fatalf("%s", err)
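As an illustration of the new filter and sort flags (addresses and counts below are made up, and the subcommand names are assumed to follow the cmdQueue* function names):

  mox queue list -to "@example.com" -n 10 -sort queued -asc
  mox queue retired list -result failure -n 20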
@@ -158,7 +226,7 @@ Messages that are on hold are not delivered until marked as off hold again, or
 otherwise handled by the admin.
 `
	var f queue.Filter
-	flagFilter(c.flag, &f)
+	flagFilterSort(c.flag, &f, nil)
	if len(c.Parse()) != 0 {
		c.Usage()
	}

@@ -174,7 +242,7 @@ Once off hold, messages can be delivered according to their current next
 delivery attempt. See the "queue schedule" command.
 `
	var f queue.Filter
-	flagFilter(c.flag, &f)
+	flagFilterSort(c.flag, &f, nil)
	if len(c.Parse()) != 0 {
		c.Usage()
	}

@@ -184,7 +252,7 @@ delivery attempt. See the "queue schedule" command.

 func ctlcmdQueueHoldSet(ctl *ctl, f queue.Filter, hold bool) {
	ctl.xwrite("queueholdset")
-	xctlwritequeuefilter(ctl, f)
+	xctlwriteJSON(ctl, f)
	if hold {
		ctl.xwrite("true")
	} else {

@@ -199,7 +267,7 @@ func ctlcmdQueueHoldSet(ctl *ctl, f queue.Filter, hold bool) {
 }

 func cmdQueueSchedule(c *cmd) {
-	c.params = "[filterflags] duration"
+	c.params = "[filterflags] [-now] duration"
	c.help = `Change next delivery attempt for matching messages.

 The next delivery attempt is adjusted by the duration parameter. If the -now

@@ -211,7 +279,7 @@ Schedule immediate delivery with "mox queue schedule -now 0".
	var fromNow bool
	c.flag.BoolVar(&fromNow, "now", false, "schedule for duration relative to current time instead of relative to current next delivery attempt for messages")
	var f queue.Filter
-	flagFilter(c.flag, &f)
+	flagFilterSort(c.flag, &f, nil)
	args := c.Parse()
	if len(args) != 1 {
		c.Usage()

@@ -224,7 +292,7 @@ Schedule immediate delivery with "mox queue schedule -now 0".

 func ctlcmdQueueSchedule(ctl *ctl, f queue.Filter, fromNow bool, d time.Duration) {
	ctl.xwrite("queueschedule")
-	xctlwritequeuefilter(ctl, f)
+	xctlwriteJSON(ctl, f)
	if fromNow {
		ctl.xwrite("yes")
	} else {

@@ -233,7 +301,7 @@ func ctlcmdQueueSchedule(ctl *ctl, f queue.Filter, fromNow bool, d time.Duration
	ctl.xwrite(d.String())
	line := ctl.xread()
	if line == "ok" {
-		fmt.Printf("%s messages rescheduled\n", ctl.xread())
+		fmt.Printf("%s message(s) rescheduled\n", ctl.xread())
	} else {
		log.Fatalf("%s", line)
	}

@@ -249,7 +317,7 @@ configured transport assigned to use for delivery, e.g. using submission to
 another mail server or with connections over a SOCKS proxy.
 `
	var f queue.Filter
-	flagFilter(c.flag, &f)
+	flagFilterSort(c.flag, &f, nil)
	args := c.Parse()
	if len(args) != 1 {
		c.Usage()

@@ -260,11 +328,11 @@ another mail server or with connections over a SOCKS proxy.

 func ctlcmdQueueTransport(ctl *ctl, f queue.Filter, transport string) {
	ctl.xwrite("queuetransport")
-	xctlwritequeuefilter(ctl, f)
+	xctlwriteJSON(ctl, f)
	ctl.xwrite(transport)
	line := ctl.xread()
	if line == "ok" {
-		fmt.Printf("%s messages changed\n", ctl.xread())
+		fmt.Printf("%s message(s) changed\n", ctl.xread())
	} else {
		log.Fatalf("%s", line)
	}

@@ -285,7 +353,7 @@ Value "default" is the default behaviour, currently for unverified opportunistic
 TLS.
 `
	var f queue.Filter
-	flagFilter(c.flag, &f)
+	flagFilterSort(c.flag, &f, nil)
	args := c.Parse()
	if len(args) != 1 {
		c.Usage()

@@ -308,7 +376,7 @@ TLS.

 func ctlcmdQueueRequireTLS(ctl *ctl, f queue.Filter, tlsreq *bool) {
	ctl.xwrite("queuerequiretls")
-	xctlwritequeuefilter(ctl, f)
+	xctlwriteJSON(ctl, f)
	var req string
	if tlsreq == nil {
		req = ""

@@ -320,7 +388,7 @@ func ctlcmdQueueRequireTLS(ctl *ctl, f queue.Filter, tlsreq *bool) {
	ctl.xwrite(req)
	line := ctl.xread()
	if line == "ok" {
-		fmt.Printf("%s messages changed\n", ctl.xread())
+		fmt.Printf("%s message(s) changed\n", ctl.xread())
	} else {
		log.Fatalf("%s", line)
	}

@@ -335,7 +403,7 @@ delivery attempts failed. The DSN (delivery status notification) message
 contains a line saying the message was canceled by the admin.
 `
	var f queue.Filter
-	flagFilter(c.flag, &f)
+	flagFilterSort(c.flag, &f, nil)
	if len(c.Parse()) != 0 {
		c.Usage()
	}

@@ -345,10 +413,10 @@ contains a line saying the message was canceled by the admin.

 func ctlcmdQueueFail(ctl *ctl, f queue.Filter) {
	ctl.xwrite("queuefail")
-	xctlwritequeuefilter(ctl, f)
+	xctlwriteJSON(ctl, f)
	line := ctl.xread()
	if line == "ok" {
-		fmt.Printf("%s messages marked as failed\n", ctl.xread())
+		fmt.Printf("%s message(s) marked as failed\n", ctl.xread())
	} else {
		log.Fatalf("%s", line)
	}

@@ -362,7 +430,7 @@ Dangerous operation, this completely removes the message. If you want to store
 the message, use "queue dump" before removing.
 `
	var f queue.Filter
-	flagFilter(c.flag, &f)
+	flagFilterSort(c.flag, &f, nil)
	if len(c.Parse()) != 0 {
		c.Usage()
	}

@@ -372,10 +440,10 @@ the message, use "queue dump" before removing.

 func ctlcmdQueueDrop(ctl *ctl, f queue.Filter) {
	ctl.xwrite("queuedrop")
-	xctlwritequeuefilter(ctl, f)
+	xctlwriteJSON(ctl, f)
	line := ctl.xread()
	if line == "ok" {
-		fmt.Printf("%s messages dropped\n", ctl.xread())
+		fmt.Printf("%s message(s) dropped\n", ctl.xread())
	} else {
		log.Fatalf("%s", line)
	}
@@ -403,3 +471,381 @@ func ctlcmdQueueDump(ctl *ctl, id string) {
		log.Fatalf("%s", err)
	}
 }
+
+func cmdQueueSuppressList(c *cmd) {
+	c.params = "[-account account]"
+	c.help = `Print addresses in suppression list.`
+	var account string
+	c.flag.StringVar(&account, "account", "", "only show suppression list for this account")
+	args := c.Parse()
+	if len(args) != 0 {
+		c.Usage()
+	}
+	mustLoadConfig()
+	ctlcmdQueueSuppressList(xctl(), account)
+}
+
+func ctlcmdQueueSuppressList(ctl *ctl, account string) {
+	ctl.xwrite("queuesuppresslist")
+	ctl.xwrite(account)
+	ctl.xreadok()
+	if _, err := io.Copy(os.Stdout, ctl.reader()); err != nil {
+		log.Fatalf("%s", err)
+	}
+}
+
+func cmdQueueSuppressAdd(c *cmd) {
+	c.params = "account address"
+	c.help = `Add address to suppression list for account.`
+	args := c.Parse()
+	if len(args) != 2 {
+		c.Usage()
+	}
+	mustLoadConfig()
+	ctlcmdQueueSuppressAdd(xctl(), args[0], args[1])
+}
+
+func ctlcmdQueueSuppressAdd(ctl *ctl, account, address string) {
+	ctl.xwrite("queuesuppressadd")
+	ctl.xwrite(account)
+	ctl.xwrite(address)
+	ctl.xreadok()
+}
+
+func cmdQueueSuppressRemove(c *cmd) {
+	c.params = "account address"
+	c.help = `Remove address from suppression list for account.`
+	args := c.Parse()
+	if len(args) != 2 {
+		c.Usage()
+	}
+	mustLoadConfig()
+	ctlcmdQueueSuppressRemove(xctl(), args[0], args[1])
+}
+
+func ctlcmdQueueSuppressRemove(ctl *ctl, account, address string) {
+	ctl.xwrite("queuesuppressremove")
+	ctl.xwrite(account)
+	ctl.xwrite(address)
+	ctl.xreadok()
+}
+
+func cmdQueueSuppressLookup(c *cmd) {
+	c.params = "[-account account] address"
+	c.help = `Check if address is present in suppression list, for any or specific account.`
+	var account string
+	c.flag.StringVar(&account, "account", "", "only check address in specified account")
+	args := c.Parse()
+	if len(args) != 1 {
+		c.Usage()
+	}
+	mustLoadConfig()
+	ctlcmdQueueSuppressLookup(xctl(), account, args[0])
+}
+
+func ctlcmdQueueSuppressLookup(ctl *ctl, account, address string) {
+	ctl.xwrite("queuesuppresslookup")
+	ctl.xwrite(account)
+	ctl.xwrite(address)
+	ctl.xreadok()
+	if _, err := io.Copy(os.Stdout, ctl.reader()); err != nil {
+		log.Fatalf("%s", err)
+	}
+}
+
+func cmdQueueRetiredList(c *cmd) {
+	c.params = "[filtersortflags]"
+	c.help = `List matching messages in the retired queue.
+
+Prints messages with their ID and results.
+`
+	var f queue.RetiredFilter
+	var s queue.RetiredSort
+	flagRetiredFilterSort(c.flag, &f, &s)
+	if len(c.Parse()) != 0 {
+		c.Usage()
+	}
+	mustLoadConfig()
+	ctlcmdQueueRetiredList(xctl(), f, s)
+}
+
+func ctlcmdQueueRetiredList(ctl *ctl, f queue.RetiredFilter, s queue.RetiredSort) {
+	ctl.xwrite("queueretiredlist")
+	xctlwriteJSON(ctl, f)
+	xctlwriteJSON(ctl, s)
+	ctl.xreadok()
+	if _, err := io.Copy(os.Stdout, ctl.reader()); err != nil {
+		log.Fatalf("%s", err)
+	}
+}
+
+func cmdQueueRetiredPrint(c *cmd) {
+	c.params = "id"
+	c.help = `Print a message from the retired queue.
+
+Prints a JSON representation of the information from the retired queue.
+`
+	args := c.Parse()
+	if len(args) != 1 {
+		c.Usage()
+	}
+	mustLoadConfig()
+	ctlcmdQueueRetiredPrint(xctl(), args[0])
+}
+
+func ctlcmdQueueRetiredPrint(ctl *ctl, id string) {
+	ctl.xwrite("queueretiredprint")
+	ctl.xwrite(id)
+	ctl.xreadok()
+	if _, err := io.Copy(os.Stdout, ctl.reader()); err != nil {
+		log.Fatalf("%s", err)
+	}
+}
+
+// note: outgoing hook events are in queue/hooks.go, mox-/config.go, queue.go and webapi/gendoc.sh. keep in sync.
+
+// flagHookFilterSort is used by many of the queue commands to accept flags for
+// filtering the webhooks the operation applies to.
+func flagHookFilterSort(fs *flag.FlagSet, f *queue.HookFilter, s *queue.HookSort) {
+	fs.Func("ids", "comma-separated list of webhook IDs", func(v string) error {
+		for _, s := range strings.Split(v, ",") {
+			id, err := strconv.ParseInt(s, 10, 64)
+			if err != nil {
+				return err
+			}
+			f.IDs = append(f.IDs, id)
+		}
+		return nil
+	})
+	fs.IntVar(&f.Max, "n", 0, "number of webhooks to return")
+	fs.StringVar(&f.Account, "account", "", "account that queued the message/webhook")
+	fs.StringVar(&f.Submitted, "submitted", "", `filter by time of submission relative to now, value must start with "<" (before now) or ">" (after now)`)
+	fs.StringVar(&f.NextAttempt, "nextattempt", "", `filter by time of next delivery attempt relative to now, value must start with "<" (before now) or ">" (after now)`)
+	fs.Func("event", `event this webhook is about: incoming, delivered, suppressed, delayed, failed, relayed, expanded, canceled, unrecognized`, func(v string) error {
+		switch v {
+		case "incoming", "delivered", "suppressed", "delayed", "failed", "relayed", "expanded", "canceled", "unrecognized":
+			f.Event = v
+		default:
+			return fmt.Errorf("invalid parameter %q", v)
+		}
+		return nil
+	})
+	if s != nil {
+		fs.Func("sort", `field to sort by, "nextattempt" (default) or "queued"`, func(v string) error {
+			switch v {
+			case "nextattempt":
+				s.Field = "NextAttempt"
+			case "queued":
+				s.Field = "Queued"
+			default:
+				return fmt.Errorf("unknown value %q", v)
+			}
+			return nil
+		})
+		fs.BoolVar(&s.Asc, "asc", false, "sort ascending instead of descending (default)")
+	}
+}
+
+// flagHookRetiredFilterSort is used by many of the queue commands to accept flags
+// for filtering the webhooks the operation applies to.
+func flagHookRetiredFilterSort(fs *flag.FlagSet, f *queue.HookRetiredFilter, s *queue.HookRetiredSort) {
+	fs.Func("ids", "comma-separated list of retired webhook IDs", func(v string) error {
+		for _, s := range strings.Split(v, ",") {
+			id, err := strconv.ParseInt(s, 10, 64)
+			if err != nil {
+				return err
+			}
+			f.IDs = append(f.IDs, id)
+		}
+		return nil
+	})
+	fs.IntVar(&f.Max, "n", 0, "number of webhooks to return")
+	fs.StringVar(&f.Account, "account", "", "account that queued the message/webhook")
+	fs.StringVar(&f.Submitted, "submitted", "", `filter by time of submission relative to now, value must start with "<" (before now) or ">" (after now)`)
+	fs.StringVar(&f.LastActivity, "lastactivity", "", `filter by time of last activity relative to now, value must start with "<" (before now) or ">" (after now)`)
+	fs.Func("event", `event this webhook is about: incoming, delivered, suppressed, delayed, failed, relayed, expanded, canceled, unrecognized`, func(v string) error {
+		switch v {
+		case "incoming", "delivered", "suppressed", "delayed", "failed", "relayed", "expanded", "canceled", "unrecognized":
+			f.Event = v
+		default:
+			return fmt.Errorf("invalid parameter %q", v)
+		}
+		return nil
+	})
+	if s != nil {
+		fs.Func("sort", `field to sort by, "lastactivity" (default) or "queued"`, func(v string) error {
+			switch v {
+			case "lastactivity":
+				s.Field = "LastActivity"
+			case "queued":
+				s.Field = "Queued"
+			default:
+				return fmt.Errorf("unknown value %q", v)
+			}
+			return nil
+		})
+		fs.BoolVar(&s.Asc, "asc", false, "sort ascending instead of descending (default)")
+	}
+}
+
+func cmdQueueHookList(c *cmd) {
+	c.params = "[filtersortflags]"
+	c.help = `List matching webhooks in the queue.
+
+Prints list of webhooks, their IDs and basic information.
+`
+	var f queue.HookFilter
+	var s queue.HookSort
+	flagHookFilterSort(c.flag, &f, &s)
+	if len(c.Parse()) != 0 {
+		c.Usage()
+	}
+	mustLoadConfig()
+	ctlcmdQueueHookList(xctl(), f, s)
+}
+
+func ctlcmdQueueHookList(ctl *ctl, f queue.HookFilter, s queue.HookSort) {
+	ctl.xwrite("queuehooklist")
+	xctlwriteJSON(ctl, f)
+	xctlwriteJSON(ctl, s)
+	ctl.xreadok()
+	if _, err := io.Copy(os.Stdout, ctl.reader()); err != nil {
+		log.Fatalf("%s", err)
+	}
+}
+
+func cmdQueueHookSchedule(c *cmd) {
+	c.params = "[filterflags] duration"
+	c.help = `Change next delivery attempt for matching webhooks.
+
+The next delivery attempt is adjusted by the duration parameter. If the -now
+flag is set, the new delivery attempt is set to the duration added to the
+current time, instead of added to the current scheduled time.
+
+Schedule immediate delivery with "mox queue schedule -now 0".
+`
+	var fromNow bool
+	c.flag.BoolVar(&fromNow, "now", false, "schedule for duration relative to current time instead of relative to current next delivery attempt for webhooks")
+	var f queue.HookFilter
+	flagHookFilterSort(c.flag, &f, nil)
+	args := c.Parse()
+	if len(args) != 1 {
+		c.Usage()
+	}
+	d, err := time.ParseDuration(args[0])
+	xcheckf(err, "parsing duration %q", args[0])
+	mustLoadConfig()
+	ctlcmdQueueHookSchedule(xctl(), f, fromNow, d)
+}
+
+func ctlcmdQueueHookSchedule(ctl *ctl, f queue.HookFilter, fromNow bool, d time.Duration) {
+	ctl.xwrite("queuehookschedule")
+	xctlwriteJSON(ctl, f)
+	if fromNow {
+		ctl.xwrite("yes")
+	} else {
+		ctl.xwrite("")
+	}
+	ctl.xwrite(d.String())
+	line := ctl.xread()
+	if line == "ok" {
+		fmt.Printf("%s webhook(s) rescheduled\n", ctl.xread())
+	} else {
+		log.Fatalf("%s", line)
+	}
+}
+
+func cmdQueueHookCancel(c *cmd) {
+	c.params = "[filterflags]"
+	c.help = `Fail delivery of matching webhooks.`
+	var f queue.HookFilter
+	flagHookFilterSort(c.flag, &f, nil)
+	if len(c.Parse()) != 0 {
+		c.Usage()
+	}
+	mustLoadConfig()
+	ctlcmdQueueHookCancel(xctl(), f)
+}
+
+func ctlcmdQueueHookCancel(ctl *ctl, f queue.HookFilter) {
+	ctl.xwrite("queuehookcancel")
+	xctlwriteJSON(ctl, f)
+	line := ctl.xread()
+	if line == "ok" {
+		fmt.Printf("%s webhook(s) marked as canceled\n", ctl.xread())
+	} else {
+		log.Fatalf("%s", line)
+	}
+}
+
+func cmdQueueHookPrint(c *cmd) {
+	c.params = "id"
+	c.help = `Print details of a webhook from the queue.
+
+The webhook is printed to stdout as JSON.
+`
+	args := c.Parse()
+	if len(args) != 1 {
+		c.Usage()
+	}
+	mustLoadConfig()
+	ctlcmdQueueHookPrint(xctl(), args[0])
+}
+
+func ctlcmdQueueHookPrint(ctl *ctl, id string) {
+	ctl.xwrite("queuehookprint")
+	ctl.xwrite(id)
+	ctl.xreadok()
+	if _, err := io.Copy(os.Stdout, ctl.reader()); err != nil {
+		log.Fatalf("%s", err)
+	}
+}
+
+func cmdQueueHookRetiredList(c *cmd) {
+	c.params = "[filtersortflags]"
+	c.help = `List matching webhooks in the retired queue.
+
+Prints list of retired webhooks, their IDs and basic information.
+`
+	var f queue.HookRetiredFilter
+	var s queue.HookRetiredSort
+	flagHookRetiredFilterSort(c.flag, &f, &s)
+	if len(c.Parse()) != 0 {
+		c.Usage()
+	}
+	mustLoadConfig()
+	ctlcmdQueueHookRetiredList(xctl(), f, s)
+}
+
+func ctlcmdQueueHookRetiredList(ctl *ctl, f queue.HookRetiredFilter, s queue.HookRetiredSort) {
+	ctl.xwrite("queuehookretiredlist")
+	xctlwriteJSON(ctl, f)
+	xctlwriteJSON(ctl, s)
+	ctl.xreadok()
+	if _, err := io.Copy(os.Stdout, ctl.reader()); err != nil {
+		log.Fatalf("%s", err)
+	}
+}
+
+func cmdQueueHookRetiredPrint(c *cmd) {
+	c.params = "id"
+	c.help = `Print details of a webhook from the retired queue.
+
+The retired webhook is printed to stdout as JSON.
+`
+	args := c.Parse()
+	if len(args) != 1 {
+		c.Usage()
+	}
+	mustLoadConfig()
+	ctlcmdQueueHookRetiredPrint(xctl(), args[0])
+}
+
+func ctlcmdQueueHookRetiredPrint(ctl *ctl, id string) {
+	ctl.xwrite("queuehookretiredprint")
+	ctl.xwrite(id)
+	ctl.xreadok()
+	if _, err := io.Copy(os.Stdout, ctl.reader()); err != nil {
+		log.Fatalf("%s", err)
+	}
+}
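For illustration, the new suppression and webhook subcommands would be invoked along these lines (account, address and counts are made up; the exact subcommand names are assumed to follow the cmdQueue* function names above):

  mox queue suppress add exampleaccount recipient@remote.example
  mox queue suppress list -account exampleaccount
  mox queue hook list -event failed -n 20
  mox queue hook retired list -account exampleaccount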
@@ -30,6 +30,7 @@ import (
	"github.com/mjl-/mox/smtpclient"
	"github.com/mjl-/mox/store"
	"github.com/mjl-/mox/tlsrpt"
+	"github.com/mjl-/mox/webhook"
 )

 // Increased each time an outgoing connection is made for direct delivery. Used by

@@ -155,7 +156,7 @@ func deliverDirect(qlog mlog.Log, resolver dns.Resolver, dialer smtpclient.Diale
	if permanent {
		err = smtpclient.Error{Permanent: true, Err: err}
	}
-	fail(ctx, qlog, msgs, m0.DialedIPs, backoff, dsn.NameIP{}, err)
+	failMsgsDB(qlog, msgs, m0.DialedIPs, backoff, dsn.NameIP{}, err)
	return
 }

@@ -175,7 +176,7 @@ func deliverDirect(qlog mlog.Log, resolver dns.Resolver, dialer smtpclient.Diale
	} else {
		qlog.Infox("mtasts lookup temporary error, aborting delivery attempt", err, slog.Any("domain", origNextHop))
		recipientDomainResult.Summary.TotalFailureSessionCount++
-		fail(ctx, qlog, msgs, m0.DialedIPs, backoff, dsn.NameIP{}, err)
+		failMsgsDB(qlog, msgs, m0.DialedIPs, backoff, dsn.NameIP{}, err)
		return
	}
 }

@@ -298,19 +299,39 @@ func deliverDirect(qlog mlog.Log, resolver dns.Resolver, dialer smtpclient.Diale
		continue
	}

-	delIDs := make([]int64, len(result.delivered))
+	delMsgs := make([]Msg, len(result.delivered))
	for i, mr := range result.delivered {
		mqlog := nqlog.With(slog.Int64("msgid", mr.msg.ID), slog.Any("recipient", mr.msg.Recipient()))
		mqlog.Info("delivered from queue")
-		delIDs[i] = mr.msg.ID
+		mr.msg.markResult(0, "", "", true)
+		delMsgs[i] = *mr.msg
	}
-	if len(delIDs) > 0 {
-		if err := queueDelete(context.Background(), delIDs...); err != nil {
-			nqlog.Errorx("deleting messages from queue after delivery", err)
+	if len(delMsgs) > 0 {
+		err := DB.Write(context.Background(), func(tx *bstore.Tx) error {
+			return retireMsgs(nqlog, tx, webhook.EventDelivered, 0, "", nil, delMsgs...)
+		})
+		if err != nil {
+			nqlog.Errorx("deleting messages from queue database after delivery", err)
+		} else if err := removeMsgsFS(nqlog, delMsgs...); err != nil {
+			nqlog.Errorx("removing queued messages from file system after delivery", err)
		}
+		kick()
	}
-	for _, mr := range result.failed {
-		fail(ctx, nqlog, []*Msg{mr.msg}, m0.DialedIPs, backoff, remoteMTA, smtpclient.Error(mr.resp))
+	if len(result.failed) > 0 {
+		err := DB.Write(context.Background(), func(tx *bstore.Tx) error {
+			for _, mr := range result.failed {
+				failMsgsTx(nqlog, tx, []*Msg{mr.msg}, m0.DialedIPs, backoff, remoteMTA, smtpclient.Error(mr.resp))
+			}
+			return nil
+		})
+		if err != nil {
+			for _, mr := range result.failed {
+				nqlog.Errorx("error processing delivery failure for messages", err,
+					slog.Int64("msgid", mr.msg.ID),
+					slog.Any("recipient", mr.msg.Recipient()))
+			}
+		}
+		kick()
	}
	return
 }

@@ -335,11 +356,11 @@ func deliverDirect(qlog mlog.Log, resolver dns.Resolver, dialer smtpclient.Diale
		Secode: smtp.SePol7MissingReqTLS30,
		Err: fmt.Errorf("destination servers do not support requiretls"),
	}
-	fail(ctx, qlog, msgs, m0.DialedIPs, backoff, remoteMTA, err)
+	failMsgsDB(qlog, msgs, m0.DialedIPs, backoff, remoteMTA, err)
	return
 }

-fail(ctx, qlog, msgs, m0.DialedIPs, backoff, remoteMTA, lastErr)
+failMsgsDB(qlog, msgs, m0.DialedIPs, backoff, remoteMTA, lastErr)
 return
 }
167
queue/dsn.go
167
queue/dsn.go
|
@ -8,6 +8,7 @@ import (
|
||||||
"log/slog"
|
"log/slog"
|
||||||
"net"
|
"net"
|
||||||
"os"
|
"os"
|
||||||
|
"slices"
|
||||||
"strings"
|
"strings"
|
||||||
"time"
|
"time"
|
||||||
|
|
||||||
|
@ -24,6 +25,7 @@ import (
|
||||||
"github.com/mjl-/mox/smtp"
|
"github.com/mjl-/mox/smtp"
|
||||||
"github.com/mjl-/mox/smtpclient"
|
"github.com/mjl-/mox/smtpclient"
|
||||||
"github.com/mjl-/mox/store"
|
"github.com/mjl-/mox/store"
|
||||||
|
"github.com/mjl-/mox/webhook"
|
||||||
)
|
)
|
||||||
|
|
||||||
var (
|
var (
|
||||||
|
@@ -35,8 +37,32 @@
 	)
 )
 
-// todo: rename function, perhaps put some of the params in a delivery struct so we don't pass all the params all the time?
-func fail(ctx context.Context, qlog mlog.Log, msgs []*Msg, dialedIPs map[string][]net.IP, backoff time.Duration, remoteMTA dsn.NameIP, err error) {
+// failMsgsDB calls failMsgsTx with a new transaction, logging transaction errors.
+func failMsgsDB(qlog mlog.Log, msgs []*Msg, dialedIPs map[string][]net.IP, backoff time.Duration, remoteMTA dsn.NameIP, err error) {
+	xerr := DB.Write(context.Background(), func(tx *bstore.Tx) error {
+		failMsgsTx(qlog, tx, msgs, dialedIPs, backoff, remoteMTA, err)
+		return nil
+	})
+	if xerr != nil {
+		for _, m := range msgs {
+			qlog.Errorx("error marking delivery as failed", xerr,
+				slog.String("delivererr", err.Error()),
+				slog.Int64("msgid", m.ID),
+				slog.Any("recipient", m.Recipient()),
+				slog.Duration("backoff", backoff),
+				slog.Time("nextattempt", m.NextAttempt))
+		}
+	}
+	kick()
+}
+
+// todo: perhaps put some of the params in a delivery struct so we don't pass all the params all the time?
+
+// failMsgsTx processes a failure to deliver msgs. If the error is permanent, a DSN
+// is delivered to the sender account.
+// Caller must call kick() after commiting the transaction for any (re)scheduling
+// of messages and webhooks.
+func failMsgsTx(qlog mlog.Log, tx *bstore.Tx, msgs []*Msg, dialedIPs map[string][]net.IP, backoff time.Duration, remoteMTA dsn.NameIP, err error) {
 	// todo future: when we implement relaying, we should be able to send DSNs to non-local users. and possibly specify a null mailfrom. ../rfc/5321:1503
 	// todo future: when we implement relaying, and a dsn cannot be delivered, and requiretls was active, we cannot drop the message. instead deliver to local postmaster? though ../rfc/8689:383 may intend to say the dsn should be delivered without requiretls?
 	// todo future: when we implement smtp dsn extension, parameter RET=FULL must be disregarded for messages with REQUIRETLS. ../rfc/8689:379
@@ -49,6 +75,7 @@ func fail(ctx context.Context, qlog mlog.Log, msgs []*Msg, dialedIPs map[string]
 	var errmsg = err.Error()
 	var code int
 	var secodeOpt string
+	var event webhook.OutgoingEvent
 	if errors.As(err, &cerr) {
 		if cerr.Line != "" {
 			smtpLines = append([]string{cerr.Line}, cerr.MoreLines...)
@@ -69,22 +96,56 @@ func fail(ctx context.Context, qlog mlog.Log, msgs []*Msg, dialedIPs map[string]
 	}
 
 	if permanent || m0.MaxAttempts == 0 && m0.Attempts >= 8 || m0.MaxAttempts > 0 && m0.Attempts >= m0.MaxAttempts {
-		for _, m := range msgs {
-			qmlog := qlog.With(slog.Int64("msgid", m.ID), slog.Any("recipient", m.Recipient()))
-			qmlog.Errorx("permanent failure delivering from queue", err)
-			deliverDSNFailure(ctx, qmlog, *m, remoteMTA, secodeOpt, errmsg, smtpLines)
-		}
-		if err := queueDelete(context.Background(), ids...); err != nil {
-			qlog.Errorx("deleting messages from queue after permanent failure", err)
-		}
-		return
-	}
+		event = webhook.EventFailed
+		if errors.Is(err, errSuppressed) {
+			event = webhook.EventSuppressed
+		}
 
-	// All messages should have the same DialedIPs, so we can update them all at once.
-	qup := bstore.QueryDB[Msg](context.Background(), DB)
-	qup.FilterIDs(ids)
-	if _, xerr := qup.UpdateNonzero(Msg{LastError: errmsg, DialedIPs: dialedIPs}); err != nil {
-		qlog.Errorx("storing delivery error", xerr, slog.String("deliveryerror", errmsg))
+		rmsgs := make([]Msg, len(msgs))
+		var scl []suppressionCheck
+		for i, m := range msgs {
+			rm := *m
+			rm.DialedIPs = dialedIPs
+			rm.markResult(code, secodeOpt, errmsg, false)
+
+			qmlog := qlog.With(slog.Int64("msgid", rm.ID), slog.Any("recipient", m.Recipient()))
+			qmlog.Errorx("permanent failure delivering from queue", err)
+			deliverDSNFailure(qmlog, rm, remoteMTA, secodeOpt, errmsg, smtpLines)
+
+			rmsgs[i] = rm
+
+			// If this was an smtp error from remote, we'll pass the failure to the
+			// suppression list.
+			if code == 0 {
+				continue
+			}
+			sc := suppressionCheck{
+				MsgID:     rm.ID,
+				Account:   rm.SenderAccount,
+				Recipient: rm.Recipient(),
+				Code:      code,
+				Secode:    secodeOpt,
+				Source:    "queue",
+			}
+			scl = append(scl, sc)
+		}
+		var suppressedMsgIDs []int64
+		if len(scl) > 0 {
+			var err error
+			suppressedMsgIDs, err = suppressionProcess(qlog, tx, scl...)
+			if err != nil {
+				qlog.Errorx("processing delivery failure in suppression list", err)
+				return
+			}
+		}
+		err := retireMsgs(qlog, tx, event, code, secodeOpt, suppressedMsgIDs, rmsgs...)
+		if err != nil {
+			qlog.Errorx("deleting queue messages from database after permanent failure", err)
+		} else if err := removeMsgsFS(qlog, rmsgs...); err != nil {
+			qlog.Errorx("remove queue messages from file system after permanent failure", err)
+		}
+
+		return
 	}
 
 	if m0.Attempts == 5 {
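Note on the give-up condition above: it relies on Go's operator precedence (&& binds tighter than ||), so a message is retired when the error is permanent, when it has used the default limit of 8 attempts, or when it reached an explicit per-message maximum. A standalone sketch of the same predicate, with illustrative parameter names that are not taken from mox:

// giveUp reports whether delivery should stop and the message be retired.
// Mirrors the condition in failMsgsTx above; names are illustrative only.
func giveUp(permanent bool, attempts, maxAttempts int) bool {
	if permanent {
		return true
	}
	if maxAttempts == 0 {
		return attempts >= 8 // Default limit when no explicit maximum is configured.
	}
	return attempts >= maxAttempts
}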
@@ -95,7 +156,7 @@ func fail(ctx context.Context, qlog mlog.Log, msgs []*Msg, dialedIPs map[string]
 		for _, m := range msgs {
 			qmlog := qlog.With(slog.Int64("msgid", m.ID), slog.Any("recipient", m.Recipient()))
 			qmlog.Errorx("temporary failure delivering from queue, sending delayed dsn", err, slog.Duration("backoff", backoff))
-			deliverDSNDelay(ctx, qmlog, *m, remoteMTA, secodeOpt, errmsg, smtpLines, retryUntil)
+			deliverDSNDelay(qmlog, *m, remoteMTA, secodeOpt, errmsg, smtpLines, retryUntil)
 		}
 	} else {
 		for _, m := range msgs {
@@ -106,9 +167,53 @@ func fail(ctx context.Context, qlog mlog.Log, msgs []*Msg, dialedIPs map[string]
 				slog.Time("nextattempt", m0.NextAttempt))
 		}
 	}
+
+	process := func() error {
+		// Update DialedIPs in message, and record the result.
+		qup := bstore.QueryTx[Msg](tx)
+		qup.FilterIDs(ids)
+		umsgs, err := qup.List()
+		if err != nil {
+			return fmt.Errorf("retrieving messages for marking temporary delivery error: %v", err)
+		}
+		for _, um := range umsgs {
+			// All messages should have the same DialedIPs.
+			um.DialedIPs = dialedIPs
+			um.markResult(code, secodeOpt, errmsg, false)
+			if err := tx.Update(&um); err != nil {
+				return fmt.Errorf("updating message after temporary failure to deliver: %v", err)
+			}
+		}
+
+		// If configured, we'll queue webhooks for delivery.
+		accConf, ok := mox.Conf.Account(m0.SenderAccount)
+		if !(ok && accConf.OutgoingWebhook != nil && (len(accConf.OutgoingWebhook.Events) == 0 || slices.Contains(accConf.OutgoingWebhook.Events, string(webhook.EventDelayed)))) {
+			return nil
+		}
+
+		hooks := make([]Hook, len(msgs))
+		for i, m := range msgs {
+			var err error
+			hooks[i], err = hookCompose(*m, accConf.OutgoingWebhook.URL, accConf.OutgoingWebhook.Authorization, webhook.EventDelayed, false, code, secodeOpt)
+			if err != nil {
+				return fmt.Errorf("composing webhook for failed delivery attempt for msg id %d: %v", m.ID, err)
+			}
+		}
+		now := time.Now()
+		for i := range hooks {
+			if err := hookInsert(tx, &hooks[i], now, accConf.KeepRetiredWebhookPeriod); err != nil {
+				return fmt.Errorf("inserting webhook into queue: %v", err)
+			}
+			qlog.Debug("queueing webhook for temporary delivery errors", hooks[i].attrs()...)
+		}
+		return nil
+	}
+	if err := process(); err != nil {
+		qlog.Errorx("processing temporary delivery error", err, slog.String("deliveryerror", errmsg))
+	}
 }
 
-func deliverDSNFailure(ctx context.Context, log mlog.Log, m Msg, remoteMTA dsn.NameIP, secodeOpt, errmsg string, smtpLines []string) {
+func deliverDSNFailure(log mlog.Log, m Msg, remoteMTA dsn.NameIP, secodeOpt, errmsg string, smtpLines []string) {
 	const subject = "mail delivery failed"
 	message := fmt.Sprintf(`
 Delivery has failed permanently for your email to:
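The guard in process() above checks both that the account has an outgoing webhook configured and that the event is subscribed to: an empty Events list means all events. The same filter, pulled out as a tiny standalone helper as it would read inside this package (the helper name is an assumption, not mox code):

// wantEvent reports whether a webhook should be queued for the given event name.
// An empty events list subscribes to everything; otherwise the event must be
// listed explicitly, mirroring the condition used in process() above.
func wantEvent(events []string, event string) bool {
	return len(events) == 0 || slices.Contains(events, event)
}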
@@ -125,10 +230,10 @@ Error during the last delivery attempt:
 		message += "\nFull SMTP response:\n\n\t" + strings.Join(smtpLines, "\n\t") + "\n"
 	}
 
-	deliverDSN(ctx, log, m, remoteMTA, secodeOpt, errmsg, smtpLines, true, nil, subject, message)
+	deliverDSN(log, m, remoteMTA, secodeOpt, errmsg, smtpLines, true, nil, subject, message)
 }
 
-func deliverDSNDelay(ctx context.Context, log mlog.Log, m Msg, remoteMTA dsn.NameIP, secodeOpt, errmsg string, smtpLines []string, retryUntil time.Time) {
+func deliverDSNDelay(log mlog.Log, m Msg, remoteMTA dsn.NameIP, secodeOpt, errmsg string, smtpLines []string, retryUntil time.Time) {
 	// Should not happen, but doesn't hurt to prevent sending delayed delivery
 	// notifications for DMARC reports. We don't want to waste postmaster attention.
 	if m.IsDMARCReport {
@@ -152,14 +257,14 @@ Error during the last delivery attempt:
 		message += "\nFull SMTP response:\n\n\t" + strings.Join(smtpLines, "\n\t") + "\n"
 	}
 
-	deliverDSN(ctx, log, m, remoteMTA, secodeOpt, errmsg, smtpLines, false, &retryUntil, subject, message)
+	deliverDSN(log, m, remoteMTA, secodeOpt, errmsg, smtpLines, false, &retryUntil, subject, message)
 }
 
 // We only queue DSNs for delivery failures for emails submitted by authenticated
 // users. So we are delivering to local users. ../rfc/5321:1466
 // ../rfc/5321:1494
 // ../rfc/7208:490
-func deliverDSN(ctx context.Context, log mlog.Log, m Msg, remoteMTA dsn.NameIP, secodeOpt, errmsg string, smtpLines []string, permanent bool, retryUntil *time.Time, subject, textBody string) {
+func deliverDSN(log mlog.Log, m Msg, remoteMTA dsn.NameIP, secodeOpt, errmsg string, smtpLines []string, permanent bool, retryUntil *time.Time, subject, textBody string) {
 	kind := "delayed delivery"
 	if permanent {
 		kind = "failure"
@@ -203,7 +308,7 @@ func deliverDSN(ctx context.Context, log mlog.Log, m Msg, remoteMTA dsn.NameIP,
 	// ../rfc/3461:1329
 	var smtpDiag string
 	if len(smtpLines) > 0 {
-		smtpDiag = "smtp; " + strings.Join(smtpLines, " ")
+		smtpDiag = strings.Join(smtpLines, " ")
 	}
 
 	dsnMsg := &dsn.Message{
@@ -221,14 +326,14 @@ func deliverDSN(ctx context.Context, log mlog.Log, m Msg, remoteMTA dsn.NameIP,
 
 		Recipients: []dsn.Recipient{
 			{
 				FinalRecipient: m.Recipient(),
 				Action:         action,
 				Status:         status,
 				StatusComment:  errmsg,
 				RemoteMTA:      remoteMTA,
-				DiagnosticCode: smtpDiag,
+				DiagnosticCodeSMTP: smtpDiag,
 				LastAttemptDate: *m.LastAttempt,
 				WillRetryUntil: retryUntil,
 			},
 		},
 
|
queue/hook.go (new file, 1240 lines; diff not shown here)
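The webhook delivery code in queue/hook.go is not shown in this rendering, but the tests below exercise deliveries that carry HTTP basic authentication, X-Mox-Webhook-ID and X-Mox-Webhook-Attempt headers, and a JSON body, with error responses causing a scheduled retry. A minimal receiving endpoint could look like the sketch below; the listen address, credentials and the generic payload handling are assumptions for illustration, not mox's documented contract:

package main

import (
	"encoding/json"
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/webhook", func(w http.ResponseWriter, r *http.Request) {
		// Reject calls that don't carry the credentials configured for the account.
		if user, pass, ok := r.BasicAuth(); !ok || user != "username" || pass != "password" {
			http.Error(w, "unauthorized", http.StatusUnauthorized)
			return
		}
		// Mox identifies the webhook and the attempt number in headers (see tests below).
		id := r.Header.Get("X-Mox-Webhook-ID")
		attempt := r.Header.Get("X-Mox-Webhook-Attempt")

		// Decode generically here; see webhook.Incoming/webhook.Outgoing for typed fields.
		var payload map[string]any
		if err := json.NewDecoder(r.Body).Decode(&payload); err != nil {
			http.Error(w, "bad json", http.StatusBadRequest)
			return
		}
		log.Printf("webhook %s, attempt %s: %v", id, attempt, payload["Event"])
		w.WriteHeader(http.StatusOK) // An error status makes mox schedule a retry.
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:8080", nil))
}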
queue/hook_test.go (new file, 688 lines)
@@ -0,0 +1,688 @@
|
||||||
|
package queue
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"encoding/json"
|
||||||
|
"fmt"
|
||||||
|
"net/http"
|
||||||
|
"net/http/httptest"
|
||||||
|
"slices"
|
||||||
|
"strings"
|
||||||
|
"testing"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/mjl-/bstore"
|
||||||
|
|
||||||
|
"github.com/mjl-/mox/dsn"
|
||||||
|
"github.com/mjl-/mox/message"
|
||||||
|
"github.com/mjl-/mox/smtp"
|
||||||
|
"github.com/mjl-/mox/store"
|
||||||
|
"github.com/mjl-/mox/webhook"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Test webhooks for incoming message that is not related to outgoing deliveries.
|
||||||
|
func TestHookIncoming(t *testing.T) {
|
||||||
|
acc, cleanup := setup(t)
|
||||||
|
defer cleanup()
|
||||||
|
err := Init()
|
||||||
|
tcheck(t, err, "queue init")
|
||||||
|
|
||||||
|
accret, err := store.OpenAccount(pkglog, "retired")
|
||||||
|
tcheck(t, err, "open account for retired")
|
||||||
|
defer func() {
|
||||||
|
accret.Close()
|
||||||
|
accret.CheckClosed()
|
||||||
|
}()
|
||||||
|
|
||||||
|
testIncoming := func(a *store.Account, expIn bool) {
|
||||||
|
t.Helper()
|
||||||
|
|
||||||
|
_, err := bstore.QueryDB[Hook](ctxbg, DB).Delete()
|
||||||
|
tcheck(t, err, "clean up hooks")
|
||||||
|
|
||||||
|
mr := bytes.NewReader([]byte(testmsg))
|
||||||
|
now := time.Now().Round(0)
|
||||||
|
m := store.Message{
|
||||||
|
ID: 123,
|
||||||
|
RemoteIP: "::1",
|
||||||
|
MailFrom: "sender@remote.example",
|
||||||
|
MailFromLocalpart: "sender",
|
||||||
|
MailFromDomain: "remote.example",
|
||||||
|
RcptToLocalpart: "rcpt",
|
||||||
|
RcptToDomain: "mox.example",
|
||||||
|
MsgFromLocalpart: "mjl",
|
||||||
|
MsgFromDomain: "mox.example",
|
||||||
|
MsgFromOrgDomain: "mox.example",
|
||||||
|
EHLOValidated: true,
|
||||||
|
MailFromValidated: true,
|
||||||
|
MsgFromValidated: true,
|
||||||
|
EHLOValidation: store.ValidationPass,
|
||||||
|
MailFromValidation: store.ValidationPass,
|
||||||
|
MsgFromValidation: store.ValidationDMARC,
|
||||||
|
DKIMDomains: []string{"remote.example"},
|
||||||
|
Received: now,
|
||||||
|
Size: int64(len(testmsg)),
|
||||||
|
}
|
||||||
|
part, err := message.EnsurePart(pkglog.Logger, true, mr, int64(len(testmsg)))
|
||||||
|
tcheck(t, err, "parsing message")
|
||||||
|
|
||||||
|
err = Incoming(ctxbg, pkglog, a, "<random@localhost>", m, part, "Inbox")
|
||||||
|
tcheck(t, err, "pass incoming message")
|
||||||
|
|
||||||
|
hl, err := bstore.QueryDB[Hook](ctxbg, DB).List()
|
||||||
|
tcheck(t, err, "list hooks")
|
||||||
|
if !expIn {
|
||||||
|
tcompare(t, len(hl), 0)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
tcompare(t, len(hl), 1)
|
||||||
|
h := hl[0]
|
||||||
|
tcompare(t, h.IsIncoming, true)
|
||||||
|
var in webhook.Incoming
|
||||||
|
dec := json.NewDecoder(strings.NewReader(h.Payload))
|
||||||
|
err = dec.Decode(&in)
|
||||||
|
tcheck(t, err, "decode incoming webhook")
|
||||||
|
|
||||||
|
expIncoming := webhook.Incoming{
|
||||||
|
From: []webhook.NameAddress{{Address: "mjl@mox.example"}},
|
||||||
|
To: []webhook.NameAddress{{Address: "mjl@mox.example"}},
|
||||||
|
CC: []webhook.NameAddress{},
|
||||||
|
BCC: []webhook.NameAddress{},
|
||||||
|
ReplyTo: []webhook.NameAddress{},
|
||||||
|
References: []string{},
|
||||||
|
Subject: "test",
|
||||||
|
Text: "test email\n",
|
||||||
|
|
||||||
|
Structure: webhook.PartStructure(&part),
|
||||||
|
Meta: webhook.IncomingMeta{
|
||||||
|
MsgID: m.ID,
|
||||||
|
MailFrom: m.MailFrom,
|
||||||
|
MailFromValidated: m.MailFromValidated,
|
||||||
|
MsgFromValidated: m.MsgFromValidated,
|
||||||
|
RcptTo: "rcpt@mox.example",
|
||||||
|
DKIMVerifiedDomains: []string{"remote.example"},
|
||||||
|
RemoteIP: "::1",
|
||||||
|
Received: m.Received,
|
||||||
|
MailboxName: "Inbox",
|
||||||
|
Automated: false,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
tcompare(t, in, expIncoming)
|
||||||
|
}
|
||||||
|
|
||||||
|
testIncoming(acc, false)
|
||||||
|
testIncoming(accret, true)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Test with fromid and various DSNs, and delivery.
|
||||||
|
func TestFromIDIncomingDelivery(t *testing.T) {
|
||||||
|
acc, cleanup := setup(t)
|
||||||
|
defer cleanup()
|
||||||
|
err := Init()
|
||||||
|
tcheck(t, err, "queue init")
|
||||||
|
|
||||||
|
accret, err := store.OpenAccount(pkglog, "retired")
|
||||||
|
tcheck(t, err, "open account for retired")
|
||||||
|
defer func() {
|
||||||
|
accret.Close()
|
||||||
|
accret.CheckClosed()
|
||||||
|
}()
|
||||||
|
|
||||||
|
// Account that only gets webhook calls, but no retired webhooks.
|
||||||
|
acchook, err := store.OpenAccount(pkglog, "hook")
|
||||||
|
tcheck(t, err, "open account for hook")
|
||||||
|
defer func() {
|
||||||
|
acchook.Close()
|
||||||
|
acchook.CheckClosed()
|
||||||
|
}()
|
||||||
|
|
||||||
|
addr, err := smtp.ParseAddress("mjl@mox.example")
|
||||||
|
tcheck(t, err, "parse address")
|
||||||
|
path := addr.Path()
|
||||||
|
|
||||||
|
now := time.Now().Round(0)
|
||||||
|
m := store.Message{
|
||||||
|
ID: 123,
|
||||||
|
RemoteIP: "::1",
|
||||||
|
MailFrom: "sender@remote.example",
|
||||||
|
MailFromLocalpart: "sender",
|
||||||
|
MailFromDomain: "remote.example",
|
||||||
|
RcptToLocalpart: "rcpt",
|
||||||
|
RcptToDomain: "mox.example",
|
||||||
|
MsgFromLocalpart: "mjl",
|
||||||
|
MsgFromDomain: "mox.example",
|
||||||
|
MsgFromOrgDomain: "mox.example",
|
||||||
|
EHLOValidated: true,
|
||||||
|
MailFromValidated: true,
|
||||||
|
MsgFromValidated: true,
|
||||||
|
EHLOValidation: store.ValidationPass,
|
||||||
|
MailFromValidation: store.ValidationPass,
|
||||||
|
MsgFromValidation: store.ValidationDMARC,
|
||||||
|
DKIMDomains: []string{"remote.example"},
|
||||||
|
Received: now,
|
||||||
|
DSN: true,
|
||||||
|
}
|
||||||
|
|
||||||
|
testIncoming := func(a *store.Account, rawmsg []byte, retiredFromID string, expIn bool, expOut *webhook.Outgoing) {
|
||||||
|
t.Helper()
|
||||||
|
|
||||||
|
_, err := bstore.QueryDB[Hook](ctxbg, DB).Delete()
|
||||||
|
tcheck(t, err, "clean up hooks")
|
||||||
|
_, err = bstore.QueryDB[MsgRetired](ctxbg, DB).Delete()
|
||||||
|
tcheck(t, err, "clean up retired messages")
|
||||||
|
|
||||||
|
qmr := MsgRetired{
|
||||||
|
SenderAccount: a.Name,
|
||||||
|
SenderLocalpart: "sender",
|
||||||
|
SenderDomainStr: "remote.example",
|
||||||
|
RecipientLocalpart: "rcpt",
|
||||||
|
RecipientDomain: path.IPDomain,
|
||||||
|
RecipientDomainStr: "mox.example",
|
||||||
|
RecipientAddress: "rcpt@mox.example",
|
||||||
|
Success: true,
|
||||||
|
KeepUntil: now.Add(time.Minute),
|
||||||
|
}
|
||||||
|
m.RcptToLocalpart = "mjl"
|
||||||
|
qmr.FromID = retiredFromID
|
||||||
|
m.Size = int64(len(rawmsg))
|
||||||
|
m.RcptToLocalpart += smtp.Localpart("+unique")
|
||||||
|
|
||||||
|
err = DB.Insert(ctxbg, &qmr)
|
||||||
|
tcheck(t, err, "insert retired message to match")
|
||||||
|
|
||||||
|
if expOut != nil {
|
||||||
|
expOut.QueueMsgID = qmr.ID
|
||||||
|
}
|
||||||
|
|
||||||
|
mr := bytes.NewReader(rawmsg)
|
||||||
|
part, err := message.EnsurePart(pkglog.Logger, true, mr, int64(len(rawmsg)))
|
||||||
|
tcheck(t, err, "parsing message")
|
||||||
|
|
||||||
|
err = Incoming(ctxbg, pkglog, a, "<random@localhost>", m, part, "Inbox")
|
||||||
|
tcheck(t, err, "pass incoming message")
|
||||||
|
|
||||||
|
hl, err := bstore.QueryDB[Hook](ctxbg, DB).List()
|
||||||
|
tcheck(t, err, "list hooks")
|
||||||
|
if !expIn && expOut == nil {
|
||||||
|
tcompare(t, len(hl), 0)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
tcompare(t, len(hl), 1)
|
||||||
|
h := hl[0]
|
||||||
|
tcompare(t, h.IsIncoming, expIn)
|
||||||
|
if expIn {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
var out webhook.Outgoing
|
||||||
|
dec := json.NewDecoder(strings.NewReader(h.Payload))
|
||||||
|
err = dec.Decode(&out)
|
||||||
|
tcheck(t, err, "decode outgoing webhook")
|
||||||
|
|
||||||
|
out.WebhookQueued = time.Time{}
|
||||||
|
tcompare(t, &out, expOut)
|
||||||
|
}
|
||||||
|
|
||||||
|
dsncompose := func(m *dsn.Message) []byte {
|
||||||
|
buf, err := m.Compose(pkglog, false)
|
||||||
|
tcheck(t, err, "compose dsn")
|
||||||
|
return buf
|
||||||
|
}
|
||||||
|
makedsn := func(action dsn.Action) *dsn.Message {
|
||||||
|
return &dsn.Message{
|
||||||
|
From: path,
|
||||||
|
To: path,
|
||||||
|
TextBody: "explanation",
|
||||||
|
MessageID: "<dsnmsgid@localhost>",
|
||||||
|
ReportingMTA: "localhost",
|
||||||
|
Recipients: []dsn.Recipient{
|
||||||
|
{
|
||||||
|
FinalRecipient: path,
|
||||||
|
Action: action,
|
||||||
|
Status: "5.0.0.",
|
||||||
|
DiagnosticCodeSMTP: "554 5.0.0 error",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
msgfailed := dsncompose(makedsn(dsn.Failed))
|
||||||
|
|
||||||
|
// No FromID to match against, so we get a webhook for a new incoming message.
|
||||||
|
testIncoming(acc, msgfailed, "", false, nil)
|
||||||
|
testIncoming(accret, msgfailed, "mismatch", true, nil)
|
||||||
|
|
||||||
|
// DSN with multiple recipients are treated as unrecognized dsns.
|
||||||
|
multidsn := makedsn(dsn.Delivered)
|
||||||
|
multidsn.Recipients = append(multidsn.Recipients, multidsn.Recipients[0])
|
||||||
|
msgmultidsn := dsncompose(multidsn)
|
||||||
|
testIncoming(acc, msgmultidsn, "unique", false, nil)
|
||||||
|
testIncoming(accret, msgmultidsn, "unique", false, &webhook.Outgoing{
|
||||||
|
Event: webhook.EventUnrecognized,
|
||||||
|
DSN: true,
|
||||||
|
FromID: "unique",
|
||||||
|
})
|
||||||
|
|
||||||
|
msgdelayed := dsncompose(makedsn(dsn.Delayed))
|
||||||
|
testIncoming(acc, msgdelayed, "unique", false, nil)
|
||||||
|
testIncoming(accret, msgdelayed, "unique", false, &webhook.Outgoing{
|
||||||
|
Event: webhook.EventDelayed,
|
||||||
|
DSN: true,
|
||||||
|
FromID: "unique",
|
||||||
|
SMTPCode: 554,
|
||||||
|
SMTPEnhancedCode: "5.0.0",
|
||||||
|
})
|
||||||
|
|
||||||
|
msgrelayed := dsncompose(makedsn(dsn.Relayed))
|
||||||
|
testIncoming(acc, msgrelayed, "unique", false, nil)
|
||||||
|
testIncoming(accret, msgrelayed, "unique", false, &webhook.Outgoing{
|
||||||
|
Event: webhook.EventRelayed,
|
||||||
|
DSN: true,
|
||||||
|
FromID: "unique",
|
||||||
|
SMTPCode: 554,
|
||||||
|
SMTPEnhancedCode: "5.0.0",
|
||||||
|
})
|
||||||
|
|
||||||
|
msgunrecognized := dsncompose(makedsn(dsn.Action("bogus")))
|
||||||
|
testIncoming(acc, msgunrecognized, "unique", false, nil)
|
||||||
|
testIncoming(accret, msgunrecognized, "unique", false, &webhook.Outgoing{
|
||||||
|
Event: webhook.EventUnrecognized,
|
||||||
|
DSN: true,
|
||||||
|
FromID: "unique",
|
||||||
|
})
|
||||||
|
|
||||||
|
// Not a DSN but to fromid address also causes "unrecognized".
|
||||||
|
msgunrecognized2 := []byte(testmsg)
|
||||||
|
testIncoming(acc, msgunrecognized2, "unique", false, nil)
|
||||||
|
testIncoming(accret, msgunrecognized2, "unique", false, &webhook.Outgoing{
|
||||||
|
Event: webhook.EventUnrecognized,
|
||||||
|
DSN: false,
|
||||||
|
FromID: "unique",
|
||||||
|
})
|
||||||
|
|
||||||
|
msgdelivered := dsncompose(makedsn(dsn.Delivered))
|
||||||
|
testIncoming(acc, msgdelivered, "unique", false, nil)
|
||||||
|
testIncoming(accret, msgdelivered, "unique", false, &webhook.Outgoing{
|
||||||
|
Event: webhook.EventDelivered,
|
||||||
|
DSN: true,
|
||||||
|
FromID: "unique",
|
||||||
|
// This is what DSN claims.
|
||||||
|
SMTPCode: 554,
|
||||||
|
SMTPEnhancedCode: "5.0.0",
|
||||||
|
})
|
||||||
|
|
||||||
|
testIncoming(acc, msgfailed, "unique", false, nil)
|
||||||
|
testIncoming(accret, msgfailed, "unique", false, &webhook.Outgoing{
|
||||||
|
Event: webhook.EventFailed,
|
||||||
|
DSN: true,
|
||||||
|
FromID: "unique",
|
||||||
|
SMTPCode: 554,
|
||||||
|
SMTPEnhancedCode: "5.0.0",
|
||||||
|
})
|
||||||
|
|
||||||
|
// We still have a webhook in the queue from the test above.
|
||||||
|
// Try to get the hook delivered. We'll try various error handling cases and superseding.
|
||||||
|
|
||||||
|
qsize, err := HookQueueSize(ctxbg)
|
||||||
|
tcheck(t, err, "hook queue size")
|
||||||
|
tcompare(t, qsize, 1)
|
||||||
|
|
||||||
|
var handler http.HandlerFunc
|
||||||
|
handleError := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
|
||||||
|
w.WriteHeader(http.StatusInternalServerError)
|
||||||
|
fmt.Fprintln(w, "server error")
|
||||||
|
})
|
||||||
|
handleOK := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
|
||||||
|
if r.Header.Get("Authorization") != "Basic dXNlcm5hbWU6cGFzc3dvcmQ=" {
|
||||||
|
http.Error(w, "unauthorized", http.StatusUnauthorized)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
if r.Header.Get("X-Mox-Webhook-ID") == "" {
|
||||||
|
http.Error(w, "missing header x-mox-webhook-id", http.StatusBadRequest)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
if r.Header.Get("X-Mox-Webhook-Attempt") == "" {
|
||||||
|
http.Error(w, "missing header x-mox-webhook-attempt", http.StatusBadRequest)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
fmt.Fprintln(w, "ok")
|
||||||
|
})
|
||||||
|
hs := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
|
||||||
|
handler.ServeHTTP(w, r)
|
||||||
|
}))
|
||||||
|
defer hs.Close()
|
||||||
|
|
||||||
|
h, err := bstore.QueryDB[Hook](ctxbg, DB).Get()
|
||||||
|
tcheck(t, err, "get hook from queue")
|
||||||
|
|
||||||
|
next := hookNextWork(ctxbg, pkglog, map[string]struct{}{"https://other.example/": {}})
|
||||||
|
if next > 0 {
|
||||||
|
t.Fatalf("next scheduled work should be immediate, is %v", next)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Respond with an error and see a retry is scheduled.
|
||||||
|
h.URL = hs.URL
|
||||||
|
// Update hook URL in database, so we can call hookLaunchWork. We'll call
|
||||||
|
// hookDeliver for later attempts.
|
||||||
|
err = DB.Update(ctxbg, &h)
|
||||||
|
tcheck(t, err, "update hook url")
|
||||||
|
handler = handleError
|
||||||
|
hookLaunchWork(pkglog, map[string]struct{}{"https://other.example/": {}})
|
||||||
|
<-hookDeliveryResults
|
||||||
|
err = DB.Get(ctxbg, &h)
|
||||||
|
tcheck(t, err, "get hook after failed delivery attempt")
|
||||||
|
tcompare(t, h.Attempts, 1)
|
||||||
|
tcompare(t, len(h.Results), 1)
|
||||||
|
tcompare(t, h.LastResult().Success, false)
|
||||||
|
tcompare(t, h.LastResult().Code, http.StatusInternalServerError)
|
||||||
|
tcompare(t, h.LastResult().Response, "server error\n")
|
||||||
|
|
||||||
|
next = hookNextWork(ctxbg, pkglog, map[string]struct{}{})
|
||||||
|
if next <= 0 {
|
||||||
|
t.Fatalf("next scheduled work is immediate, shoud be in the future")
|
||||||
|
}
|
||||||
|
|
||||||
|
n, err := HookNextAttemptSet(ctxbg, HookFilter{}, time.Now().Add(time.Minute))
|
||||||
|
tcheck(t, err, "schedule hook to now")
|
||||||
|
tcompare(t, n, 1)
|
||||||
|
n, err = HookNextAttemptAdd(ctxbg, HookFilter{}, -time.Minute)
|
||||||
|
tcheck(t, err, "schedule hook to now")
|
||||||
|
tcompare(t, n, 1)
|
||||||
|
next = hookNextWork(ctxbg, pkglog, map[string]struct{}{})
|
||||||
|
if next > 0 {
|
||||||
|
t.Fatalf("next scheduled work should be immediate, is %v", next)
|
||||||
|
}
|
||||||
|
|
||||||
|
handler = handleOK
|
||||||
|
hookDeliver(pkglog, h)
|
||||||
|
<-hookDeliveryResults
|
||||||
|
err = DB.Get(ctxbg, &h)
|
||||||
|
tcompare(t, err, bstore.ErrAbsent)
|
||||||
|
hr := HookRetired{ID: h.ID}
|
||||||
|
err = DB.Get(ctxbg, &hr)
|
||||||
|
tcheck(t, err, "get retired hook after delivery")
|
||||||
|
tcompare(t, hr.Attempts, 2)
|
||||||
|
tcompare(t, len(hr.Results), 2)
|
||||||
|
tcompare(t, hr.LastResult().Success, true)
|
||||||
|
tcompare(t, hr.LastResult().Code, http.StatusOK)
|
||||||
|
tcompare(t, hr.LastResult().Response, "ok\n")
|
||||||
|
|
||||||
|
// Check that cleaning up retired webhooks works.
|
||||||
|
cleanupHookRetiredSingle(pkglog)
|
||||||
|
hrl, err := bstore.QueryDB[HookRetired](ctxbg, DB).List()
|
||||||
|
tcheck(t, err, "listing retired hooks")
|
||||||
|
tcompare(t, len(hrl), 0)
|
||||||
|
|
||||||
|
// Helper to get a representative webhook added to the queue.
|
||||||
|
addHook := func(a *store.Account) {
|
||||||
|
testIncoming(a, msgfailed, "unique", false, &webhook.Outgoing{
|
||||||
|
Event: webhook.EventFailed,
|
||||||
|
DSN: true,
|
||||||
|
FromID: "unique",
|
||||||
|
SMTPCode: 554,
|
||||||
|
SMTPEnhancedCode: "5.0.0",
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// Keep attempting and failing delivery until we give up.
|
||||||
|
addHook(accret)
|
||||||
|
h, err = bstore.QueryDB[Hook](ctxbg, DB).Get()
|
||||||
|
tcheck(t, err, "get added hook")
|
||||||
|
h.URL = hs.URL
|
||||||
|
handler = handleError
|
||||||
|
for i := 0; i < len(hookIntervals); i++ {
|
||||||
|
hookDeliver(pkglog, h)
|
||||||
|
<-hookDeliveryResults
|
||||||
|
err := DB.Get(ctxbg, &h)
|
||||||
|
tcheck(t, err, "get hook")
|
||||||
|
tcompare(t, h.Attempts, i+1)
|
||||||
|
}
|
||||||
|
// Final attempt.
|
||||||
|
hookDeliver(pkglog, h)
|
||||||
|
<-hookDeliveryResults
|
||||||
|
err = DB.Get(ctxbg, &h)
|
||||||
|
tcompare(t, err, bstore.ErrAbsent)
|
||||||
|
hr = HookRetired{ID: h.ID}
|
||||||
|
err = DB.Get(ctxbg, &hr)
|
||||||
|
tcheck(t, err, "get retired hook after failure")
|
||||||
|
tcompare(t, hr.Attempts, len(hookIntervals)+1)
|
||||||
|
tcompare(t, len(hr.Results), len(hookIntervals)+1)
|
||||||
|
tcompare(t, hr.LastResult().Success, false)
|
||||||
|
tcompare(t, hr.LastResult().Code, http.StatusInternalServerError)
|
||||||
|
tcompare(t, hr.LastResult().Response, "server error\n")
|
||||||
|
|
||||||
|
// Check account "hook" doesn't get retired webhooks.
|
||||||
|
addHook(acchook)
|
||||||
|
h, err = bstore.QueryDB[Hook](ctxbg, DB).Get()
|
||||||
|
tcheck(t, err, "get added hook")
|
||||||
|
handler = handleOK
|
||||||
|
h.URL = hs.URL
|
||||||
|
hookDeliver(pkglog, h)
|
||||||
|
<-hookDeliveryResults
|
||||||
|
err = DB.Get(ctxbg, &h)
|
||||||
|
tcompare(t, err, bstore.ErrAbsent)
|
||||||
|
hr = HookRetired{ID: h.ID}
|
||||||
|
err = DB.Get(ctxbg, &hr)
|
||||||
|
tcompare(t, err, bstore.ErrAbsent)
|
||||||
|
|
||||||
|
// HookCancel
|
||||||
|
addHook(accret)
|
||||||
|
h, err = bstore.QueryDB[Hook](ctxbg, DB).Get()
|
||||||
|
tcheck(t, err, "get added hook")
|
||||||
|
n, err = HookCancel(ctxbg, pkglog, HookFilter{})
|
||||||
|
tcheck(t, err, "canceling hook")
|
||||||
|
tcompare(t, n, 1)
|
||||||
|
l, err := HookList(ctxbg, HookFilter{}, HookSort{})
|
||||||
|
tcheck(t, err, "list hook")
|
||||||
|
tcompare(t, len(l), 0)
|
||||||
|
|
||||||
|
// Superseding: When a webhook is scheduled for a message that already has a
|
||||||
|
// pending webhook, the previous webhook should be removed/retired.
|
||||||
|
_, err = bstore.QueryDB[HookRetired](ctxbg, DB).Delete()
|
||||||
|
tcheck(t, err, "clean up retired webhooks")
|
||||||
|
_, err = bstore.QueryDB[MsgRetired](ctxbg, DB).Delete()
|
||||||
|
tcheck(t, err, "clean up retired messages")
|
||||||
|
qmr := MsgRetired{
|
||||||
|
SenderAccount: accret.Name,
|
||||||
|
SenderLocalpart: "sender",
|
||||||
|
SenderDomainStr: "remote.example",
|
||||||
|
RecipientLocalpart: "rcpt",
|
||||||
|
RecipientDomain: path.IPDomain,
|
||||||
|
RecipientDomainStr: "mox.example",
|
||||||
|
RecipientAddress: "rcpt@mox.example",
|
||||||
|
Success: true,
|
||||||
|
KeepUntil: now.Add(time.Minute),
|
||||||
|
FromID: "unique",
|
||||||
|
}
|
||||||
|
err = DB.Insert(ctxbg, &qmr)
|
||||||
|
tcheck(t, err, "insert retired message to match")
|
||||||
|
m.RcptToLocalpart = "mjl"
|
||||||
|
m.Size = int64(len(msgdelayed))
|
||||||
|
m.RcptToLocalpart += smtp.Localpart("+unique")
|
||||||
|
|
||||||
|
mr := bytes.NewReader(msgdelayed)
|
||||||
|
part, err := message.EnsurePart(pkglog.Logger, true, mr, int64(len(msgdelayed)))
|
||||||
|
tcheck(t, err, "parsing message")
|
||||||
|
|
||||||
|
// Cause first webhook.
|
||||||
|
err = Incoming(ctxbg, pkglog, accret, "<random@localhost>", m, part, "Inbox")
|
||||||
|
tcheck(t, err, "pass incoming message")
|
||||||
|
h, err = bstore.QueryDB[Hook](ctxbg, DB).Get()
|
||||||
|
tcheck(t, err, "get hook")
|
||||||
|
|
||||||
|
// Cause second webhook for same message. First should now be retired and marked as superseded.
|
||||||
|
err = Incoming(ctxbg, pkglog, accret, "<random@localhost>", m, part, "Inbox")
|
||||||
|
tcheck(t, err, "pass incoming message again")
|
||||||
|
h2, err := bstore.QueryDB[Hook](ctxbg, DB).Get()
|
||||||
|
tcheck(t, err, "get hook")
|
||||||
|
hr, err = bstore.QueryDB[HookRetired](ctxbg, DB).Get()
|
||||||
|
tcheck(t, err, "get retired hook")
|
||||||
|
tcompare(t, h.ID, hr.ID)
|
||||||
|
tcompare(t, hr.SupersededByID, h2.ID)
|
||||||
|
tcompare(t, h2.ID > h.ID, true)
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestHookListFilterSort(t *testing.T) {
|
||||||
|
_, cleanup := setup(t)
|
||||||
|
defer cleanup()
|
||||||
|
err := Init()
|
||||||
|
tcheck(t, err, "queue init")
|
||||||
|
|
||||||
|
now := time.Now().Round(0)
|
||||||
|
h := Hook{0, 0, "fromid", "messageid", "subj", nil, "mjl", "http://localhost", "", false, "delivered", "", now, 0, now, []HookResult{}}
|
||||||
|
h1 := h
|
||||||
|
h1.Submitted = now.Add(-time.Second)
|
||||||
|
h1.NextAttempt = now.Add(time.Minute)
|
||||||
|
hl := []Hook{h, h, h, h, h, h1}
|
||||||
|
err = DB.Write(ctxbg, func(tx *bstore.Tx) error {
|
||||||
|
for i := range hl {
|
||||||
|
err := hookInsert(tx, &hl[i], now, time.Minute)
|
||||||
|
tcheck(t, err, "insert hook")
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
})
|
||||||
|
tcheck(t, err, "inserting hooks")
|
||||||
|
h1 = hl[len(hl)-1]
|
||||||
|
|
||||||
|
hlrev := slices.Clone(hl)
|
||||||
|
slices.Reverse(hlrev)
|
||||||
|
|
||||||
|
// Ascending by nextattempt,id.
|
||||||
|
l, err := HookList(ctxbg, HookFilter{}, HookSort{Asc: true})
|
||||||
|
tcheck(t, err, "list")
|
||||||
|
tcompare(t, l, hl)
|
||||||
|
|
||||||
|
// Descending by nextattempt,id.
|
||||||
|
l, err = HookList(ctxbg, HookFilter{}, HookSort{})
|
||||||
|
tcheck(t, err, "list")
|
||||||
|
tcompare(t, l, hlrev)
|
||||||
|
|
||||||
|
// Descending by submitted,id.
|
||||||
|
l, err = HookList(ctxbg, HookFilter{}, HookSort{Field: "Submitted"})
|
||||||
|
tcheck(t, err, "list")
|
||||||
|
ll := append(append([]Hook{}, hlrev[1:]...), hl[5])
|
||||||
|
tcompare(t, l, ll)
|
||||||
|
|
||||||
|
// Filter by all fields to get a single.
|
||||||
|
allfilters := HookFilter{
|
||||||
|
Max: 2,
|
||||||
|
IDs: []int64{h1.ID},
|
||||||
|
Account: "mjl",
|
||||||
|
Submitted: "<1s",
|
||||||
|
NextAttempt: ">1s",
|
||||||
|
Event: "delivered",
|
||||||
|
}
|
||||||
|
l, err = HookList(ctxbg, allfilters, HookSort{})
|
||||||
|
tcheck(t, err, "list single")
|
||||||
|
tcompare(t, l, []Hook{h1})
|
||||||
|
|
||||||
|
// Paginated NextAttmpt asc.
|
||||||
|
var lastID int64
|
||||||
|
var last any
|
||||||
|
l = nil
|
||||||
|
for {
|
||||||
|
nl, err := HookList(ctxbg, HookFilter{Max: 1}, HookSort{Asc: true, LastID: lastID, Last: last})
|
||||||
|
tcheck(t, err, "list paginated")
|
||||||
|
l = append(l, nl...)
|
||||||
|
if len(nl) == 0 {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
tcompare(t, len(nl), 1)
|
||||||
|
lastID, last = nl[0].ID, nl[0].NextAttempt.Format(time.RFC3339Nano)
|
||||||
|
}
|
||||||
|
tcompare(t, l, hl)
|
||||||
|
|
||||||
|
// Paginated NextAttempt desc.
|
||||||
|
l = nil
|
||||||
|
lastID = 0
|
||||||
|
last = ""
|
||||||
|
for {
|
||||||
|
nl, err := HookList(ctxbg, HookFilter{Max: 1}, HookSort{LastID: lastID, Last: last})
|
||||||
|
tcheck(t, err, "list paginated")
|
||||||
|
l = append(l, nl...)
|
||||||
|
if len(nl) == 0 {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
tcompare(t, len(nl), 1)
|
||||||
|
lastID, last = nl[0].ID, nl[0].NextAttempt.Format(time.RFC3339Nano)
|
||||||
|
}
|
||||||
|
tcompare(t, l, hlrev)
|
||||||
|
|
||||||
|
// Paginated Submitted desc.
|
||||||
|
l = nil
|
||||||
|
lastID = 0
|
||||||
|
last = ""
|
||||||
|
for {
|
||||||
|
nl, err := HookList(ctxbg, HookFilter{Max: 1}, HookSort{Field: "Submitted", LastID: lastID, Last: last})
|
||||||
|
tcheck(t, err, "list paginated")
|
||||||
|
l = append(l, nl...)
|
||||||
|
if len(nl) == 0 {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
tcompare(t, len(nl), 1)
|
||||||
|
lastID, last = nl[0].ID, nl[0].Submitted.Format(time.RFC3339Nano)
|
||||||
|
}
|
||||||
|
tcompare(t, l, ll)
|
||||||
|
|
||||||
|
// Paginated Submitted asc.
|
||||||
|
l = nil
|
||||||
|
lastID = 0
|
||||||
|
last = ""
|
||||||
|
for {
|
||||||
|
nl, err := HookList(ctxbg, HookFilter{Max: 1}, HookSort{Field: "Submitted", Asc: true, LastID: lastID, Last: last})
|
||||||
|
tcheck(t, err, "list paginated")
|
||||||
|
l = append(l, nl...)
|
||||||
|
if len(nl) == 0 {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
tcompare(t, len(nl), 1)
|
||||||
|
lastID, last = nl[0].ID, nl[0].Submitted.Format(time.RFC3339Nano)
|
||||||
|
}
|
||||||
|
llrev := slices.Clone(ll)
|
||||||
|
slices.Reverse(llrev)
|
||||||
|
tcompare(t, l, llrev)
|
||||||
|
|
||||||
|
// Retire messages and do similar but more basic tests. The code is similar.
|
||||||
|
var hrl []HookRetired
|
||||||
|
err = DB.Write(ctxbg, func(tx *bstore.Tx) error {
|
||||||
|
for _, h := range hl {
|
||||||
|
hr := h.Retired(false, h.NextAttempt, time.Now().Add(time.Minute).Round(0))
|
||||||
|
err := tx.Insert(&hr)
|
||||||
|
tcheck(t, err, "inserting retired")
|
||||||
|
hrl = append(hrl, hr)
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
})
|
||||||
|
tcheck(t, err, "adding retired")
|
||||||
|
|
||||||
|
// Paginated LastActivity desc.
|
||||||
|
var lr []HookRetired
|
||||||
|
lastID = 0
|
||||||
|
last = ""
|
||||||
|
l = nil
|
||||||
|
for {
|
||||||
|
nl, err := HookRetiredList(ctxbg, HookRetiredFilter{Max: 1}, HookRetiredSort{LastID: lastID, Last: last})
|
||||||
|
tcheck(t, err, "list paginated")
|
||||||
|
lr = append(lr, nl...)
|
||||||
|
if len(nl) == 0 {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
tcompare(t, len(nl), 1)
|
||||||
|
lastID, last = nl[0].ID, nl[0].LastActivity.Format(time.RFC3339Nano)
|
||||||
|
}
|
||||||
|
hrlrev := slices.Clone(hrl)
|
||||||
|
slices.Reverse(hrlrev)
|
||||||
|
tcompare(t, lr, hrlrev)
|
||||||
|
|
||||||
|
// Filter by all fields to get a single.
|
||||||
|
allretiredfilters := HookRetiredFilter{
|
||||||
|
Max: 2,
|
||||||
|
IDs: []int64{hrlrev[0].ID},
|
||||||
|
Account: "mjl",
|
||||||
|
Submitted: "<1s",
|
||||||
|
LastActivity: ">1s",
|
||||||
|
Event: "delivered",
|
||||||
|
}
|
||||||
|
lr, err = HookRetiredList(ctxbg, allretiredfilters, HookRetiredSort{})
|
||||||
|
tcheck(t, err, "list single")
|
||||||
|
tcompare(t, lr, []HookRetired{hrlrev[0]})
|
||||||
|
}
|
queue/queue.go (1104 changed lines; diff not shown here)
@@ -13,6 +13,8 @@ import (
 	"slices"
 	"time"
 
+	"github.com/mjl-/bstore"
+
 	"github.com/mjl-/mox/config"
 	"github.com/mjl-/mox/dns"
 	"github.com/mjl-/mox/dsn"
@@ -22,6 +24,7 @@ import (
 	"github.com/mjl-/mox/smtp"
 	"github.com/mjl-/mox/smtpclient"
 	"github.com/mjl-/mox/store"
+	"github.com/mjl-/mox/webhook"
 )
 
 // todo: reuse connection? do fewer concurrently (other than with direct delivery).
@@ -91,7 +94,7 @@ func deliverSubmit(qlog mlog.Log, resolver dns.Resolver, dialer smtpclient.Diale
 			Secode: smtp.SePol7MissingReqTLS30,
 			Err:    fmt.Errorf("transport %s: message requires verified tls but transport does not verify tls", transportName),
 		}
-		fail(ctx, qlog, msgs, m0.DialedIPs, backoff, dsn.NameIP{}, submiterr)
+		failMsgsDB(qlog, msgs, m0.DialedIPs, backoff, dsn.NameIP{}, submiterr)
 		return
 	}
 
@@ -126,7 +129,7 @@ func deliverSubmit(qlog mlog.Log, resolver dns.Resolver, dialer smtpclient.Diale
 		}
 		qlog.Errorx("dialing for submission", err, slog.String("remote", addr))
 		submiterr = fmt.Errorf("transport %s: dialing %s for submission: %w", transportName, addr, err)
-		fail(ctx, qlog, msgs, m0.DialedIPs, backoff, dsn.NameIP{}, submiterr)
+		failMsgsDB(qlog, msgs, m0.DialedIPs, backoff, dsn.NameIP{}, submiterr)
 		return
 	}
 	dialcancel()
@@ -183,7 +186,7 @@ func deliverSubmit(qlog mlog.Log, resolver dns.Resolver, dialer smtpclient.Diale
 			submiterr = smtperr
 		}
 		qlog.Errorx("establishing smtp session for submission", submiterr, slog.String("remote", addr))
-		fail(ctx, qlog, msgs, m0.DialedIPs, backoff, remoteMTA, submiterr)
+		failMsgsDB(qlog, msgs, m0.DialedIPs, backoff, remoteMTA, submiterr)
 		return
 	}
 	defer func() {
@@ -208,7 +211,7 @@ func deliverSubmit(qlog mlog.Log, resolver dns.Resolver, dialer smtpclient.Diale
 	if err != nil {
 		qlog.Errorx("opening message for delivery", err, slog.String("remote", addr), slog.String("path", p))
 		submiterr = fmt.Errorf("transport %s: opening message file for submission: %w", transportName, err)
-		fail(ctx, qlog, msgs, m0.DialedIPs, backoff, dsn.NameIP{}, submiterr)
+		failMsgsDB(qlog, msgs, m0.DialedIPs, backoff, dsn.NameIP{}, submiterr)
 		return
 	}
 	msgr = store.FileMsgReader(m0.MsgPrefix, f)
@@ -229,7 +232,7 @@ func deliverSubmit(qlog mlog.Log, resolver dns.Resolver, dialer smtpclient.Diale
 		qlog.Infox("smtp transaction for delivery failed", submiterr)
 	}
 	failed = 0 // Reset, we are looking at the SMTP results below.
-	var delIDs []int64
+	var delMsgs []Msg
 	for i, m := range msgs {
 		qmlog := qlog.With(
 			slog.Int64("msgid", m.ID),
@@ -251,17 +254,24 @@ func deliverSubmit(qlog mlog.Log, resolver dns.Resolver, dialer smtpclient.Diale
 				err = smtperr
 			}
 			qmlog.Errorx("submitting message", err, slog.String("remote", addr))
-			fail(ctx, qmlog, []*Msg{m}, m0.DialedIPs, backoff, remoteMTA, err)
+			failMsgsDB(qmlog, []*Msg{m}, m0.DialedIPs, backoff, remoteMTA, err)
 			failed++
 		} else {
-			delIDs = append(delIDs, m.ID)
+			m.markResult(0, "", "", true)
+			delMsgs = append(delMsgs, *m)
 			qmlog.Info("delivered from queue with transport")
 			delivered++
 		}
 	}
-	if len(delIDs) > 0 {
-		if err := queueDelete(context.Background(), delIDs...); err != nil {
-			qlog.Errorx("deleting message from queue after delivery", err)
+	if len(delMsgs) > 0 {
+		err := DB.Write(context.Background(), func(tx *bstore.Tx) error {
+			return retireMsgs(qlog, tx, webhook.EventDelivered, 0, "", nil, delMsgs...)
+		})
+		if err != nil {
+			qlog.Errorx("remove queue message from database after delivery", err)
+		} else if err := removeMsgsFS(qlog, delMsgs...); err != nil {
+			qlog.Errorx("remove queue message from file system after delivery", err)
 		}
+		kick()
 	}
 }
|
||||||
|
|
queue/suppression.go (new file, 170 lines)
|
@ -0,0 +1,170 @@
|
||||||
|
package queue
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
|
"log/slog"
|
||||||
|
"strings"
|
||||||
|
|
||||||
|
"github.com/mjl-/bstore"
|
||||||
|
|
||||||
|
"github.com/mjl-/mox/mlog"
|
||||||
|
"github.com/mjl-/mox/smtp"
|
||||||
|
"github.com/mjl-/mox/webapi"
|
||||||
|
)
|
||||||
|
|
||||||
|
// todo: we should be processing spam complaints and add addresses to the list.
|
||||||
|
|
||||||
|
var errSuppressed = errors.New("address is on suppression list")
|
||||||
|
|
||||||
|
func baseAddress(a smtp.Path) smtp.Path {
|
||||||
|
s := string(a.Localpart)
|
||||||
|
s, _, _ = strings.Cut(s, "+")
|
||||||
|
s, _, _ = strings.Cut(s, "-")
|
||||||
|
s = strings.ReplaceAll(s, ".", "")
|
||||||
|
s = strings.ToLower(s)
|
||||||
|
return smtp.Path{Localpart: smtp.Localpart(s), IPDomain: a.IPDomain}
|
||||||
|
}
|
||||||
|
|
||||||
|
// SuppressionList returns suppression. If account is not empty, only suppression
|
||||||
|
// for that account are returned.
|
||||||
|
//
|
||||||
|
// SuppressionList does not check if an account exists.
|
||||||
|
func SuppressionList(ctx context.Context, account string) ([]webapi.Suppression, error) {
|
||||||
|
q := bstore.QueryDB[webapi.Suppression](ctx, DB)
|
||||||
|
if account != "" {
|
||||||
|
q.FilterNonzero(webapi.Suppression{Account: account})
|
||||||
|
}
|
||||||
|
return q.List()
|
||||||
|
}
|
||||||
|
|
||||||
|
// SuppressionLookup looks up a suppression for an address for an account. Returns
|
||||||
|
// a nil suppression if not found.
|
||||||
|
//
|
||||||
|
// SuppressionLookup does not check if an account exists.
|
||||||
|
func SuppressionLookup(ctx context.Context, account string, address smtp.Path) (*webapi.Suppression, error) {
|
||||||
|
baseAddr := baseAddress(address).XString(true)
|
||||||
|
q := bstore.QueryDB[webapi.Suppression](ctx, DB)
|
||||||
|
q.FilterNonzero(webapi.Suppression{Account: account, BaseAddress: baseAddr})
|
||||||
|
sup, err := q.Get()
|
||||||
|
if err == bstore.ErrAbsent {
|
||||||
|
return nil, nil
|
||||||
|
}
|
||||||
|
return &sup, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// SuppressionAdd adds a suppression for an address for an account, setting
|
||||||
|
// BaseAddress based on OriginalAddress.
|
||||||
|
//
|
||||||
|
// If the base address of original address is already present, an error is
|
||||||
|
// returned (such as from bstore).
|
||||||
|
//
|
||||||
|
// SuppressionAdd does not check if an account exists.
|
||||||
|
func SuppressionAdd(ctx context.Context, originalAddress smtp.Path, sup *webapi.Suppression) error {
|
||||||
|
sup.BaseAddress = baseAddress(originalAddress).XString(true)
|
||||||
|
sup.OriginalAddress = originalAddress.XString(true)
|
||||||
|
return DB.Insert(ctx, sup)
|
||||||
|
}
|
||||||
|
|
||||||
|
// SuppressionRemove removes a suppression. The base address for the the given
|
||||||
|
// address is removed.
|
||||||
|
//
|
||||||
|
// SuppressionRemove does not check if an account exists.
|
||||||
|
func SuppressionRemove(ctx context.Context, account string, address smtp.Path) error {
|
||||||
|
baseAddr := baseAddress(address).XString(true)
|
||||||
|
q := bstore.QueryDB[webapi.Suppression](ctx, DB)
|
||||||
|
q.FilterNonzero(webapi.Suppression{Account: account, BaseAddress: baseAddr})
|
||||||
|
n, err := q.Delete()
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if n == 0 {
|
||||||
|
return bstore.ErrAbsent
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
type suppressionCheck struct {
|
||||||
|
MsgID int64
|
||||||
|
Account string
|
||||||
|
Recipient smtp.Path
|
||||||
|
Code int
|
||||||
|
Secode string
|
||||||
|
Source string
|
||||||
|
}
|
||||||
|
|
||||||
|
// process failures, possibly creating suppressions.
|
||||||
|
func suppressionProcess(log mlog.Log, tx *bstore.Tx, scl ...suppressionCheck) (suppressedMsgIDs []int64, err error) {
|
||||||
|
for _, sc := range scl {
|
||||||
|
xlog := log.With(slog.Any("suppressioncheck", sc))
|
||||||
|
baseAddr := baseAddress(sc.Recipient).XString(true)
|
||||||
|
exists, err := bstore.QueryTx[webapi.Suppression](tx).FilterNonzero(webapi.Suppression{Account: sc.Account, BaseAddress: baseAddr}).Exists()
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("checking if address is in suppression list: %v", err)
|
||||||
|
} else if exists {
|
||||||
|
xlog.Debug("address already in suppression list")
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
|
||||||
|
origAddr := sc.Recipient.XString(true)
|
||||||
|
sup := webapi.Suppression{
|
||||||
|
Account: sc.Account,
|
||||||
|
BaseAddress: baseAddr,
|
||||||
|
OriginalAddress: origAddr,
|
||||||
|
}
|
||||||
|
|
||||||
|
if isImmedateBlock(sc.Code, sc.Secode) {
|
||||||
|
sup.Reason = fmt.Sprintf("delivery failure from %s with smtp code %d, enhanced code %q", sc.Source, sc.Code, sc.Secode)
|
||||||
|
} else {
|
||||||
|
// If two most recent deliveries failed (excluding this one, so three most recent
|
||||||
|
// messages including this one), we'll add the address to the list.
|
||||||
|
q := bstore.QueryTx[MsgRetired](tx)
|
||||||
|
q.FilterNonzero(MsgRetired{RecipientAddress: origAddr})
|
||||||
|
q.FilterNotEqual("ID", sc.MsgID)
|
||||||
|
q.SortDesc("LastActivity")
|
||||||
|
q.Limit(2)
|
||||||
|
l, err := q.List()
|
||||||
|
if err != nil {
|
||||||
|
xlog.Errorx("checking for previous delivery failures", err)
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
if len(l) < 2 || l[0].Success || l[1].Success {
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
sup.Reason = fmt.Sprintf("delivery failure from %s and three consecutive failures", sc.Source)
|
||||||
|
}
|
||||||
|
if err := tx.Insert(&sup); err != nil {
|
||||||
|
return nil, fmt.Errorf("inserting suppression: %v", err)
|
||||||
|
}
|
||||||
|
suppressedMsgIDs = append(suppressedMsgIDs, sc.MsgID)
|
||||||
|
}
|
||||||
|
return suppressedMsgIDs, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Decide whether an SMTP code and short enhanced code is a reason for an
|
||||||
|
// immediate suppression listing. For some errors, we don't want to bother the
|
||||||
|
// remote mail server again, or they may decide our behaviour looks spammy.
|
||||||
|
func isImmedateBlock(code int, secode string) bool {
|
||||||
|
switch code {
|
||||||
|
case smtp.C521HostNoMail, // Host is not interested in accepting email at all.
|
||||||
|
smtp.C550MailboxUnavail, // Likely mailbox does not exist.
|
||||||
|
smtp.C551UserNotLocal, // Also not interested in accepting email for this address.
|
||||||
|
smtp.C553BadMailbox, // We are sending a mailbox name that server doesn't understand and won't accept email for.
|
||||||
|
smtp.C556DomainNoMail: // Remote is not going to accept email for this address/domain.
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
if code/100 != 5 {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
switch secode {
|
||||||
|
case smtp.SeAddr1UnknownDestMailbox1, // Recipient localpart doesn't exist.
|
||||||
|
smtp.SeAddr1UnknownSystem2, // Bad recipient domain.
|
||||||
|
smtp.SeAddr1MailboxSyntax3, // Remote doesn't understand syntax.
|
||||||
|
smtp.SeAddr1DestMailboxMoved6, // Address no longer exists.
|
||||||
|
smtp.SeMailbox2Disabled1, // Account exists at remote, but is disabled.
|
||||||
|
smtp.SePol7DeliveryUnauth1: // Seems popular for saying we are on a blocklist.
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
return false
|
||||||
|
}
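Taken together, the functions above form the programmatic suppression-list API. A short usage sketch as it would read inside this package, using only the functions and types defined above; the account name and address are placeholders:

// manageSuppression is a sketch: add, look up and remove a suppression.
func manageSuppression(ctx context.Context) error {
	addr, err := smtp.ParseAddress("bounced@remote.example")
	if err != nil {
		return err
	}
	path := addr.Path()

	// Add. BaseAddress and OriginalAddress are derived from path.
	if err := SuppressionAdd(ctx, path, &webapi.Suppression{Account: "mjl", Reason: "manual block"}); err != nil {
		return err // Also fails if the base address is already listed.
	}

	// Look up. Any +tag or dotted variant of the same base address matches.
	if sup, err := SuppressionLookup(ctx, "mjl", path); err != nil {
		return err
	} else if sup != nil {
		// Address is suppressed; queued deliveries to it fail without an SMTP attempt.
	}

	// Remove again.
	return SuppressionRemove(ctx, "mjl", path)
}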
|
queue/suppression_test.go (new file, 107 lines)
|
@ -0,0 +1,107 @@
|
||||||
|
package queue
|
||||||
|
|
||||||
|
import (
|
||||||
|
"testing"
|
||||||
|
|
||||||
|
"github.com/mjl-/mox/smtp"
|
||||||
|
"github.com/mjl-/mox/webapi"
|
||||||
|
)
|
||||||
|
|
||||||
|
func TestSuppression(t *testing.T) {
|
||||||
|
_, cleanup := setup(t)
|
||||||
|
defer cleanup()
|
||||||
|
err := Init()
|
||||||
|
tcheck(t, err, "queue init")
|
||||||
|
|
||||||
|
l, err := SuppressionList(ctxbg, "bogus")
|
||||||
|
tcheck(t, err, "listing suppressions for unknown account")
|
||||||
|
tcompare(t, len(l), 0)
|
||||||
|
|
||||||
|
l, err = SuppressionList(ctxbg, "") // All
|
||||||
|
tcheck(t, err, "list suppression for all accounts")
|
||||||
|
tcompare(t, len(l), 0) // None yet.
|
||||||
|
|
||||||
|
+addr1, err := smtp.ParseAddress("mjl@mox.example")
+tcheck(t, err, "parse address")
+path1 := addr1.Path()
+addr2, err := smtp.ParseAddress("mjl2@mox.example")
+tcheck(t, err, "parse address")
+path2 := addr2.Path()
+addr2b, err := smtp.ParseAddress("M.j.l2+catchall@Mox.example")
+tcheck(t, err, "parse address")
+path2b := addr2b.Path()
+
+// No suppression yet.
+sup, err := SuppressionLookup(ctxbg, "mjl", path1)
+tcheck(t, err, "lookup suppression")
+tcompare(t, sup == nil, true)
+
+// No error if account does not exist.
+sup, err = SuppressionLookup(ctxbg, "bogus", path1)
+tcompare(t, err == nil, true)
+tcompare(t, sup == nil, true)
+
+// Can add a suppression once.
+err = SuppressionAdd(ctxbg, path1, &webapi.Suppression{Account: "mjl"})
+tcheck(t, err, "add suppression")
+// No duplicates.
+err = SuppressionAdd(ctxbg, path1, &webapi.Suppression{Account: "mjl"})
+tcompare(t, err == nil, false)
+// Account must be set in Suppression.
+err = SuppressionAdd(ctxbg, path1, &webapi.Suppression{})
+tcompare(t, err == nil, false)
+
+// Duplicate check is done after making base address.
+err = SuppressionAdd(ctxbg, path2, &webapi.Suppression{Account: "retired"})
+tcheck(t, err, "add suppression")
+err = SuppressionAdd(ctxbg, path2b, &webapi.Suppression{Account: "retired"})
+tcompare(t, err == nil, false) // Duplicate.
+
+l, err = SuppressionList(ctxbg, "") // All
+tcheck(t, err, "list suppression for all accounts")
+tcompare(t, len(l), 2)
+l, err = SuppressionList(ctxbg, "mjl")
+tcheck(t, err, "list suppression for mjl")
+tcompare(t, len(l), 1)
+
+// path1 is listed for mjl.
+sup, err = SuppressionLookup(ctxbg, "mjl", path1)
+tcheck(t, err, "lookup")
+tcompare(t, sup == nil, false)
+
+// Accounts don't influence each other.
+sup, err = SuppressionLookup(ctxbg, "mjl", path2)
+tcheck(t, err, "lookup")
+tcompare(t, sup == nil, true)
+
+// Simplified address is present.
+sup, err = SuppressionLookup(ctxbg, "retired", path2)
+tcheck(t, err, "lookup")
+tcompare(t, sup == nil, false)
+
+// Original address is also present.
+sup, err = SuppressionLookup(ctxbg, "retired", path2b)
+tcheck(t, err, "lookup")
+tcompare(t, sup == nil, false)
+
+// Can remove again.
+err = SuppressionRemove(ctxbg, "mjl", path1)
+tcheck(t, err, "remove")
+// But not twice.
+err = SuppressionRemove(ctxbg, "mjl", path1)
+tcompare(t, err == nil, false)
+// No longer present.
+sup, err = SuppressionLookup(ctxbg, "mjl", path1)
+tcheck(t, err, "lookup")
+tcompare(t, sup == nil, true)
+
+// Can remove for any form of the address, was added as path2b.
+err = SuppressionRemove(ctxbg, "retired", path2b)
+tcheck(t, err, "lookup")
+
+// Account names are not validated.
+err = SuppressionAdd(ctxbg, path1, &webapi.Suppression{Account: "bogus"})
+tcheck(t, err, "add suppression")
+err = SuppressionRemove(ctxbg, "bogus", path1)
+tcheck(t, err, "remove suppression")
+}
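The duplicate and lookup behaviour in this test hinges on suppression entries being stored under a simplified base form of the address, which is why "mjl2@mox.example" and "M.j.l2+catchall@Mox.example" are treated as the same entry. The standalone sketch below only approximates that simplification (lowercasing, cutting off the catchall suffix, dropping dots from the localpart); the authoritative rules are those of the queue package, and the function name here is made up for illustration.

package main

import (
	"fmt"
	"strings"
)

// baseAddress roughly mimics the simplification applied before storing or
// looking up a suppression: lowercase the address, cut off anything after the
// catchall separator, and drop dots from the localpart.
func baseAddress(addr, catchallSep string) string {
	addr = strings.ToLower(addr)
	localpart, domain, ok := strings.Cut(addr, "@")
	if !ok {
		return addr
	}
	if catchallSep != "" {
		localpart, _, _ = strings.Cut(localpart, catchallSep)
	}
	localpart = strings.ReplaceAll(localpart, ".", "")
	return localpart + "@" + domain
}

func main() {
	// Both forms map to the same suppression entry.
	fmt.Println(baseAddress("mjl2@mox.example", "+"))            // mjl2@mox.example
	fmt.Println(baseAddress("M.j.l2+catchall@Mox.example", "+")) // mjl2@mox.example
}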
@@ -90,8 +90,8 @@ domains with HTTP/HTTPS, including with automatic TLS with ACME, is easily
 configured through both configuration files and admin web interface, and can act
 as a reverse proxy (and static file server for that matter), so you can forward
 traffic to your existing backend applications. Look for "WebHandlers:" in the
-output of "mox config describe-domains" and see the output of "mox example
-webhandlers".
+output of "mox config describe-domains" and see the output of
+"mox config example webhandlers".
 `
 var existingWebserver bool
 var hostname string

@@ -563,7 +563,8 @@ WARNING: Could not verify outgoing smtp connections can be made, outgoing
 delivery may not be working. Many providers block outgoing smtp connections by
 default, requiring an explicit request or a cooldown period before allowing
 outgoing smtp connections. To send through a smarthost, configure a "Transport"
-in mox.conf and use it in "Routes" in domains.conf. See "mox example transport".
+in mox.conf and use it in "Routes" in domains.conf. See
+"mox config example transport".

 `)
 }

@@ -774,6 +775,7 @@ and check the admin page for the needed DNS records.`)
 internal.AccountHTTP.Enabled = true
 internal.AdminHTTP.Enabled = true
 internal.WebmailHTTP.Enabled = true
+internal.WebAPIHTTP.Enabled = true
 internal.MetricsHTTP.Enabled = true
 if existingWebserver {
 internal.AccountHTTP.Port = 1080

@@ -782,6 +784,8 @@ and check the admin page for the needed DNS records.`)
 internal.AdminHTTP.Forwarded = true
 internal.WebmailHTTP.Port = 1080
 internal.WebmailHTTP.Forwarded = true
+internal.WebAPIHTTP.Port = 1080
+internal.WebAPIHTTP.Forwarded = true
 internal.AutoconfigHTTPS.Enabled = true
 internal.AutoconfigHTTPS.Port = 81
 internal.AutoconfigHTTPS.NonTLS = true
serve.go | 2

@@ -78,7 +78,7 @@ func start(mtastsdbRefresher, sendDMARCReports, sendTLSReports, skipForkExec boo
 return fmt.Errorf("tlsrpt init: %s", err)
 }

-done := make(chan struct{}, 1)
+done := make(chan struct{}, 4) // Goroutines for messages and webhooks, and cleaners.
 if err := queue.Start(dns.StrictResolver{Pkg: "queue"}, done); err != nil {
 return fmt.Errorf("queue start: %s", err)
 }

@@ -54,7 +54,7 @@ func queueDSN(ctx context.Context, log mlog.Log, c *conn, rcptTo smtp.Path, m ds
 if requireTLS {
 reqTLS = &requireTLS
 }
-qm := queue.MakeMsg(smtp.Path{}, rcptTo, has8bit, smtputf8, int64(len(buf)), m.MessageID, nil, reqTLS, time.Now())
+qm := queue.MakeMsg(smtp.Path{}, rcptTo, has8bit, smtputf8, int64(len(buf)), m.MessageID, nil, reqTLS, time.Now(), m.Subject)
 qm.DSNUTF8 = bufUTF8
 if err := queue.Add(ctx, c.log, "", f, qm); err != nil {
 return err
@@ -6,6 +6,7 @@ import (
 "bytes"
 "context"
 "crypto/md5"
+cryptorand "crypto/rand"
 "crypto/rsa"
 "crypto/sha1"
 "crypto/sha256"

@@ -21,8 +22,8 @@ import (
 "net/textproto"
 "os"
 "runtime/debug"
+"slices"
 "sort"
-"strconv"
 "strings"
 "sync"
 "time"

@@ -150,7 +151,7 @@ var (
 "reason",
 },
 )
-// Similar between ../webmail/webmail.go:/metricSubmission and ../smtpserver/server.go:/metricSubmission
+// Similar between ../webmail/webmail.go:/metricSubmission and ../smtpserver/server.go:/metricSubmission and ../webapisrv/server.go:/metricSubmission
 metricSubmission = promauto.NewCounterVec(
 prometheus.CounterOpts{
 Name: "mox_smtpserver_submission_total",

@@ -1944,7 +1945,7 @@ func hasTLSRequiredNo(h textproto.MIMEHeader) bool {

 // submit is used for mail from authenticated users that we will try to deliver.
 func (c *conn) submit(ctx context.Context, recvHdrFor func(string) string, msgWriter *message.Writer, dataFile *os.File, part *message.Part) {
-// Similar between ../smtpserver/server.go:/submit\( and ../webmail/webmail.go:/MessageSubmit\(
+// Similar between ../smtpserver/server.go:/submit\( and ../webmail/api.go:/MessageSubmit\( and ../webapisrv/server.go:/Send\(

 var msgPrefix []byte

@@ -2017,6 +2018,26 @@ func (c *conn) submit(ctx context.Context, recvHdrFor func(string) string, msgWr
 })
 xcheckf(err, "read-only transaction")

+// We gather any X-Mox-Extra-* headers into the "extra" data during queueing, which
+// will make it into any webhook we deliver.
+// todo: remove the X-Mox-Extra-* headers from the message. we don't currently rewrite the message...
+// todo: should we not canonicalize keys?
+var extra map[string]string
+for k, vl := range header {
+if !strings.HasPrefix(k, "X-Mox-Extra-") {
+continue
+}
+if extra == nil {
+extra = map[string]string{}
+}
+xk := k[len("X-Mox-Extra-"):]
+// We don't allow duplicate keys.
+if _, ok := extra[xk]; ok || len(vl) > 1 {
+xsmtpUserErrorf(smtp.C554TransactionFailed, smtp.SeMsg6Other0, "duplicate x-mox-extra- key %q", xk)
+}
+extra[xk] = vl[len(vl)-1]
+}
+
 // todo future: in a pedantic mode, we can parse the headers, and return an error if rcpt is only in To or Cc header, and not in the non-empty Bcc header. indicates a client that doesn't blind those bcc's.

 // Add DKIM signatures.

@@ -2054,13 +2075,23 @@ func (c *conn) submit(ctx context.Context, recvHdrFor func(string) string, msgWr
 msgPrefix = append(msgPrefix, []byte(authResults.Header())...)

 // We always deliver through the queue. It would be more efficient to deliver
-// directly, but we don't want to circumvent all the anti-spam measures. Accounts
-// on a single mox instance should be allowed to block each other.
+// directly for local accounts, but we don't want to circumvent all the anti-spam
+// measures. Accounts on a single mox instance should be allowed to block each
+// other.
+
+accConf, _ := c.account.Conf()
+loginAddr, err := smtp.ParseAddress(c.username)
+xcheckf(err, "parsing login address")
+useFromID := slices.Contains(accConf.ParsedFromIDLoginAddresses, loginAddr)
+var localpartBase string
+if useFromID {
+localpartBase = strings.SplitN(string(c.mailFrom.Localpart), confDom.LocalpartCatchallSeparator, 2)[0]
+}
 now := time.Now()
 qml := make([]queue.Msg, len(c.recipients))
 for i, rcptAcc := range c.recipients {
 if Localserve {
-code, timeout := localserveNeedsError(rcptAcc.rcptTo.Localpart)
+code, timeout := mox.LocalserveNeedsError(rcptAcc.rcptTo.Localpart)
 if timeout {
 c.log.Info("timing out submission due to special localpart")
 mox.Sleep(mox.Context, time.Hour)

@@ -2071,6 +2102,13 @@ func (c *conn) submit(ctx context.Context, recvHdrFor func(string) string, msgWr
 }
 }

+fp := *c.mailFrom
+var fromID string
+if useFromID {
+fromID = xrandomID(16)
+fp.Localpart = smtp.Localpart(localpartBase + confDom.LocalpartCatchallSeparator + fromID)
+}
+
 // For multiple recipients, we don't make each message prefix unique, leaving out
 // the "for" clause in the Received header. This allows the queue to deliver the
 // messages in a single smtp transaction.

@@ -2080,11 +2118,13 @@ func (c *conn) submit(ctx context.Context, recvHdrFor func(string) string, msgWr
 }
 xmsgPrefix := append([]byte(recvHdrFor(rcptTo)), msgPrefix...)
 msgSize := int64(len(xmsgPrefix)) + msgWriter.Size
-qm := queue.MakeMsg(*c.mailFrom, rcptAcc.rcptTo, msgWriter.Has8bit, c.msgsmtputf8, msgSize, messageID, xmsgPrefix, c.requireTLS, now)
+qm := queue.MakeMsg(fp, rcptAcc.rcptTo, msgWriter.Has8bit, c.msgsmtputf8, msgSize, messageID, xmsgPrefix, c.requireTLS, now, header.Get("Subject"))
 if !c.futureRelease.IsZero() {
 qm.NextAttempt = c.futureRelease
 qm.FutureReleaseRequest = c.futureReleaseRequest
 }
+qm.FromID = fromID
+qm.Extra = extra
 qml[i] = qm
 }

@@ -2124,6 +2164,20 @@ func (c *conn) submit(ctx context.Context, recvHdrFor func(string) string, msgWr
 c.writecodeline(smtp.C250Completed, smtp.SeMailbox2Other0, "it is done", nil)
 }

+func xrandomID(n int) string {
+return base64.RawURLEncoding.EncodeToString(xrandom(n))
+}
+
+func xrandom(n int) []byte {
+buf := make([]byte, n)
+x, err := cryptorand.Read(buf)
+xcheckf(err, "read random")
+if x != n {
+xcheckf(errors.New("short random read"), "read random")
+}
+return buf
+}
+
 func ipmasked(ip net.IP) (string, string, string) {
 if ip.To4() != nil {
 m1 := ip.String()

@@ -2137,31 +2191,8 @@ func ipmasked(ip net.IP) (string, string, string) {
 return m1, m2, m3
 }

-func localserveNeedsError(lp smtp.Localpart) (code int, timeout bool) {
-s := string(lp)
-if strings.HasSuffix(s, "temperror") {
-return smtp.C451LocalErr, false
-} else if strings.HasSuffix(s, "permerror") {
-return smtp.C550MailboxUnavail, false
-} else if strings.HasSuffix(s, "timeout") {
-return 0, true
-}
-if len(s) < 3 {
-return 0, false
-}
-s = s[len(s)-3:]
-v, err := strconv.ParseInt(s, 10, 32)
-if err != nil {
-return 0, false
-}
-if v < 400 || v > 600 {
-return 0, false
-}
-return int(v), false
-}
-
 func (c *conn) xlocalserveError(lp smtp.Localpart) {
-code, timeout := localserveNeedsError(lp)
+code, timeout := mox.LocalserveNeedsError(lp)
 if timeout {
 c.log.Info("timing out due to special localpart")
 mox.Sleep(mox.Context, time.Hour)

@@ -2178,7 +2209,16 @@ func (c *conn) xlocalserveError(lp smtp.Localpart) {
 func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgWriter *message.Writer, iprevStatus iprev.Status, iprevAuthentic bool, dataFile *os.File) {
 // todo: in decision making process, if we run into (some) temporary errors, attempt to continue. if we decide to accept, all good. if we decide to reject, we'll make it a temporary reject.

-msgFrom, envelope, headers, err := message.From(c.log.Logger, false, dataFile, nil)
+var msgFrom smtp.Address
+var envelope *message.Envelope
+var headers textproto.MIMEHeader
+var isDSN bool
+part, err := message.Parse(c.log.Logger, false, dataFile)
+if err == nil {
+// todo: is it enough to check only the content-type header? in other places we look at the content-types of the parts before considering a message a dsn. should we change other places to this simpler check?
+isDSN = part.MediaType == "MULTIPART" && part.MediaSubType == "REPORT" && strings.EqualFold(part.ContentTypeParams["report-type"], "delivery-status")
+msgFrom, envelope, headers, err = message.From(c.log.Logger, false, dataFile, &part)
+}
 if err != nil {
 c.log.Infox("parsing message for From address", err)
 }
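The isDSN check introduced above keys off the message's top-level content type. As a self-contained illustration (using the standard library's mime package rather than mox's message.Part fields), the same condition can be expressed as:

package main

import (
	"fmt"
	"mime"
	"strings"
)

// isDSNContentType mirrors the condition added to deliver(): a top-level
// multipart/report with report-type=delivery-status is treated as a DSN.
func isDSNContentType(ct string) bool {
	mt, params, err := mime.ParseMediaType(ct)
	if err != nil {
		return false
	}
	return strings.EqualFold(mt, "multipart/report") &&
		strings.EqualFold(params["report-type"], "delivery-status")
}

func main() {
	ct := `multipart/report; report-type=delivery-status; boundary="b1"`
	fmt.Println(isDSNContentType(ct)) // true
}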
@@ -2676,6 +2716,7 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 MailFromValidation: mailFromValidation,
 MsgFromValidation: msgFromValidation,
 DKIMDomains: verifiedDKIMDomains,
+DSN: isDSN,
 Size: msgWriter.Size,
 }
 if c.tls {

@@ -2960,8 +3001,8 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 }
 }

-if Localserve {
-code, timeout := localserveNeedsError(rcptAcc.rcptTo.Localpart)
+if Localserve && !strings.HasPrefix(string(rcptAcc.rcptTo.Localpart), "queue") {
+code, timeout := mox.LocalserveNeedsError(rcptAcc.rcptTo.Localpart)
 if timeout {
 log.Info("timing out due to special localpart")
 mox.Sleep(mox.Context, time.Hour)

@@ -2972,6 +3013,7 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 addError(rcptAcc, code, smtp.SeOther00, false, fmt.Sprintf("failure with code %d due to special localpart", code))
 }
 }
+var delivered bool
 acc.WithWLock(func() {
 if err := acc.DeliverMailbox(log, a.mailbox, &m, dataFile); err != nil {
 log.Errorx("delivering", err)

@@ -2983,6 +3025,7 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 }
 return
 }
+delivered = true
 metricDelivery.WithLabelValues("delivered", a.reason).Inc()
 log.Info("incoming message delivered", slog.String("reason", a.reason), slog.Any("msgfrom", msgFrom))

@@ -2994,6 +3037,18 @@ func (c *conn) deliver(ctx context.Context, recvHdrFor func(string) string, msgW
 }
 })

+// Pass delivered messages to queue for DSN processing and/or hooks.
+if delivered {
+mr := store.FileMsgReader(m.MsgPrefix, dataFile)
+part, err := m.LoadPart(mr)
+if err != nil {
+log.Errorx("loading parsed part for evaluating webhook", err)
+} else {
+err = queue.Incoming(context.Background(), log, acc, messageID, m, part, a.mailbox)
+log.Check(err, "queueing webhook for incoming delivery")
+}
+}
+
 err = acc.Close()
 log.Check(err, "closing account after delivering")
 acc = nil
@@ -143,6 +143,7 @@ func (ts *testserver) close() {
 }

 func (ts *testserver) run(fn func(helloErr error, client *smtpclient.Client)) {
+ts.t.Helper()
 ts.runRaw(func(conn net.Conn) {
 ts.t.Helper()

@@ -1443,7 +1444,7 @@ test email
 }
 tcheck(t, err, "deliver")

-msgs, err := queue.List(ctxbg, queue.Filter{})
+msgs, err := queue.List(ctxbg, queue.Filter{}, queue.Sort{})
 tcheck(t, err, "listing queue")
 n++
 tcompare(t, len(msgs), n)

@@ -1592,7 +1593,7 @@ test email
 }
 tcheck(t, err, "deliver")

-msgs, err := queue.List(ctxbg, queue.Filter{})
+msgs, err := queue.List(ctxbg, queue.Filter{}, queue.Sort{})
 tcheck(t, err, "listing queue")
 tcompare(t, len(msgs), 1)
 tcompare(t, msgs[0].RequireTLS, expRequireTLS)

@@ -1808,8 +1809,8 @@ QW4gYXR0YWNoZWQgdGV4dCBmaWxlLg==
 return
 }

-msgs, _ := queue.List(ctxbg, queue.Filter{})
-queuedMsg := msgs[len(msgs)-1]
+msgs, _ := queue.List(ctxbg, queue.Filter{}, queue.Sort{Field: "Queued", Asc: false})
+queuedMsg := msgs[0]
 if queuedMsg.SMTPUTF8 != expectedSmtputf8 {
 t.Fatalf("[%s / %s / %s / %s] got SMTPUTF8 %t, expected %t", mailFrom, rcptTo, headerValue, filename, queuedMsg.SMTPUTF8, expectedSmtputf8)
 }

@@ -1828,3 +1829,79 @@ QW4gYXR0YWNoZWQgdGV4dCBmaWxlLg==
 test(`Ω@mox.example`, `🙂@example.org`, "header-utf8-😍", "utf8-🫠️.txt", true, true, nil)
 test(`mjl@mox.example`, `remote@xn--vg8h.example.org`, "header-ascii", "ascii.txt", true, false, nil)
 }
+
+// TestExtra checks whether submission of messages with "X-Mox-Extra-<key>: value"
+// headers causes those key/value pairs to be added to the Extra field in the
+// queue.
+func TestExtra(t *testing.T) {
+ts := newTestServer(t, filepath.FromSlash("../testdata/smtp/mox.conf"), dns.MockResolver{})
+defer ts.close()
+
+ts.user = "mjl@mox.example"
+ts.pass = password0
+ts.submission = true
+
+extraMsg := strings.ReplaceAll(`From: <mjl@mox.example>
+To: <remote@example.org>
+Subject: test
+X-Mox-Extra-Test: testvalue
+X-Mox-Extra-a: 123
+X-Mox-Extra-☺: ☹
+X-Mox-Extra-x-cANONICAL-z: ok
+Message-Id: <test@mox.example>
+
+test email
+`, "\n", "\r\n")
+
+ts.run(func(err error, client *smtpclient.Client) {
+t.Helper()
+tcheck(t, err, "init client")
+mailFrom := "mjl@mox.example"
+rcptTo := "mjl@mox.example"
+err = client.Deliver(ctxbg, mailFrom, rcptTo, int64(len(extraMsg)), strings.NewReader(extraMsg), true, true, false)
+tcheck(t, err, "deliver")
+})
+msgs, err := queue.List(ctxbg, queue.Filter{}, queue.Sort{})
+tcheck(t, err, "queue list")
+tcompare(t, len(msgs), 1)
+tcompare(t, msgs[0].Extra, map[string]string{
+"Test": "testvalue",
+"A": "123",
+"☺": "☹",
+"X-Canonical-Z": "ok",
+})
+// note: these headers currently stay in the message.
+}
+
+// TestExtraDup checks for an error for duplicate x-mox-extra-* keys.
+func TestExtraDup(t *testing.T) {
+ts := newTestServer(t, filepath.FromSlash("../testdata/smtp/mox.conf"), dns.MockResolver{})
+defer ts.close()
+
+ts.user = "mjl@mox.example"
+ts.pass = password0
+ts.submission = true
+
+extraMsg := strings.ReplaceAll(`From: <mjl@mox.example>
+To: <remote@example.org>
+Subject: test
+X-Mox-Extra-Test: testvalue
+X-Mox-Extra-Test: testvalue
+Message-Id: <test@mox.example>
+
+test email
+`, "\n", "\r\n")
+
+ts.run(func(err error, client *smtpclient.Client) {
+t.Helper()
+tcheck(t, err, "init client")
+mailFrom := "mjl@mox.example"
+rcptTo := "mjl@mox.example"
+err = client.Deliver(ctxbg, mailFrom, rcptTo, int64(len(extraMsg)), strings.NewReader(extraMsg), true, true, false)
+var cerr smtpclient.Error
+expErr := smtpclient.Error{Code: smtp.C554TransactionFailed, Secode: smtp.SeMsg6Other0}
+if err == nil || !errors.As(err, &cerr) || cerr.Code != expErr.Code || cerr.Secode != expErr.Secode {
+t.Fatalf("got err %#v, expected %#v", err, expErr)
+}
+})
+}
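The tests above drive submission through mox's internal smtpclient. An application that only wants to attach extra data to a message it submits can do the same with just the standard library; the endpoint, credentials and header key in this sketch are placeholders.

package main

import (
	"log"
	"net/smtp"
	"strings"
)

func main() {
	// Placeholder submission endpoint and credentials.
	addr := "localhost:1587"
	auth := smtp.PlainAuth("", "mjl@mox.example", "password", "localhost")

	msg := strings.ReplaceAll(`From: <mjl@mox.example>
To: <remote@example.org>
Subject: test
X-Mox-Extra-Order: 12345
Message-Id: <example@mox.example>

test email
`, "\n", "\r\n")

	// The X-Mox-Extra-Order header becomes {"Order": "12345"} in the queued
	// message's Extra data and travels along in webhooks about this delivery.
	err := smtp.SendMail(addr, auth, "mjl@mox.example", []string{"remote@example.org"}, []byte(msg))
	if err != nil {
		log.Fatalf("submit: %v", err)
	}
}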
@@ -485,8 +485,8 @@ type Message struct {
 // filtering).
 IsMailingList bool

-// If this message is a DSN. For DSNs, we don't look at the subject when matching
-// threads.
+// If this message is a DSN, generated by us or received. For DSNs, we don't look
+// at the subject when matching threads.
 DSN bool

 ReceivedTLSVersion uint16 // 0 if unknown, 1 if plaintext/no TLS, otherwise TLS cipher suite.

@@ -1265,7 +1265,8 @@ func (a *Account) HighestDeletedModSeq(tx *bstore.Tx) (ModSeq, error) {
 return v.HighestDeletedModSeq, err
 }

-// WithWLock runs fn with account writelock held. Necessary for account/mailbox modification. For message delivery, a read lock is required.
+// WithWLock runs fn with account writelock held. Necessary for account/mailbox
+// modification. For message delivery, a read lock is required.
 func (a *Account) WithWLock(fn func()) {
 a.Lock()
 defer a.Unlock()

@@ -2224,6 +2225,32 @@ func (f Flags) Changed(other Flags) (mask Flags) {
 return
 }

+// Strings returns the flags that are set in their string form.
+func (f Flags) Strings() []string {
+fields := []struct {
+word string
+have bool
+}{
+{`$forwarded`, f.Forwarded},
+{`$junk`, f.Junk},
+{`$mdnsent`, f.MDNSent},
+{`$notjunk`, f.Notjunk},
+{`$phishing`, f.Phishing},
+{`\answered`, f.Answered},
+{`\deleted`, f.Deleted},
+{`\draft`, f.Draft},
+{`\flagged`, f.Flagged},
+{`\seen`, f.Seen},
+}
+var l []string
+for _, fh := range fields {
+if fh.have {
+l = append(l, fh.word)
+}
+}
+return l
+}
+
 var systemWellKnownFlags = map[string]bool{
 `\answered`: true,
 `\flagged`: true,
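A quick usage sketch for the new Flags.Strings helper: with only Junk and Seen set it returns the words in the order of the field table above.

package main

import (
	"fmt"

	"github.com/mjl-/mox/store"
)

func main() {
	f := store.Flags{Junk: true, Seen: true}
	fmt.Println(f.Strings()) // [$junk \seen]
}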
testdata/ctl/domains.conf | 4 (vendored)

@@ -2,6 +2,10 @@ Domains:
 mox.example: nil
 Accounts:
 mjl:
+OutgoingWebhook:
+URL: http://localhost:1234
+KeepRetiredMessagePeriod: 1h0m0s
+KeepRetiredWebhookPeriod: 1h0m0s
 Domain: mox.example
 Destinations:
 mjl@mox.example: nil

testdata/httpaccount/domains.conf | 3 (vendored)

@@ -1,5 +1,6 @@
 Domains:
-mox.example: nil
+mox.example:
+LocalpartCatchallSeparator: +
 Accounts:
 mjl☺:
 Domain: mox.example

testdata/queue/domains.conf | 25 (vendored)

@@ -1,10 +1,33 @@
 Domains:
-mox.example: nil
+mox.example:
+LocalpartCatchallSeparator: +
 Accounts:
 mjl:
 Domain: mox.example
 Destinations:
 mjl@mox.example: nil
+retired:
+Domain: mox.example
+Destinations:
+retired@mox.example: nil
+KeepRetiredMessagePeriod: 1ns
+KeepRetiredWebhookPeriod: 1ns
+OutgoingWebhook:
+URL: http://localhost:1234/outgoing
+Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=
+IncomingWebhook:
+URL: http://localhost:1234/incoming
+Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=
+hook:
+Domain: mox.example
+Destinations:
+hook@mox.example: nil
+OutgoingWebhook:
+URL: http://localhost:1234/outgoing
+Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=
+IncomingWebhook:
+URL: http://localhost:1234/incoming
+Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=

 Routes:
 -
testdata/webapisrv/domains.conf | 32 (new file, vendored)

@@ -0,0 +1,32 @@
+Domains:
+mox.example:
+LocalpartCatchallSeparator: +
+DKIM:
+Selectors:
+testsel:
+PrivateKeyFile: testsel.rsakey.pkcs8.pem
+Sign:
+- testsel
+Accounts:
+other:
+Domain: mox.example
+Destinations:
+other@mox.example: nil
+mjl:
+MaxOutgoingMessagesPerDay: 30
+MaxFirstTimeRecipientsPerDay: 10
+Domain: mox.example
+FromIDLoginAddresses:
+- mjl+fromid@mox.example
+Destinations:
+mjl@mox.example: nil
+møx@mox.example: nil
+móx@mox.example: nil
+RejectsMailbox: Rejects
+JunkFilter:
+Threshold: 0.95
+Params:
+Twograms: true
+MaxPower: 0.1
+TopWords: 10
+IgnoreWords: 0.1

testdata/webapisrv/mox.conf | 11 (new file, vendored)

@@ -0,0 +1,11 @@
+DataDir: data
+User: 1000
+LogLevel: trace
+Hostname: mox.example
+Listeners:
+local:
+IPs:
+- 0.0.0.0
+Postmaster:
+Account: mjl
+Mailbox: postmaster

testdata/webapisrv/testsel.rsakey.pkcs8.pem | 30 (new file, vendored)

@@ -0,0 +1,30 @@
+-----BEGIN PRIVATE KEY-----
+Note: RSA private key for use with DKIM, generated by mox
+
+MIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQDdkh3fKzvRUWym
+n9UwVrEw6s2Mc0+DTg04TWJKGKHXpvcTHuEcE6ALVS9MZKasyVsIHU7FNeS9/qNb
+pLihhGdlhU3KAfrMpTBhiFpJoYiDXED98Of4iBxNHIuheLMxSBSClMbLGE2vAgha
+/6LuONuzdMqk/c1TijBD+vGjCZI2qD58cgXWWKRK9e+WNhKNoVdedZ9iJtbtN0MI
+UWk3iwHmjXf5qzS7i8vDoy86Ln0HW0vKl7UtwemLVv09/E23OdNN163eQvSlrEhx
+a0odPQsM9SizxhiaI9rmcZtSqULt37hhPaNA+/AbELCzWijZPDqePVRqKGd5gYDK
+8STLj0UHAgMBAAECggEBAKVkJJgplYUx2oCmXmSu0aVKIBTvHjNNV+DnIq9co7Ju
+F5BWRILIw3ayJ5RGrYPc6e6ssdfT2uNX6GjIFGm8g9HsJ5zazXNk+zBSr9K2mUg0
+3O6xnPaP41BMNo5ZoqjuvSCcHagMhDBWvBXxLJXWK2lRjNKMAXCSfmTANQ8WXeYd
+XG2nYTPtBu6UgY8W6sKAx1xetxBrzk8q6JTxb5eVG22BSiUniWYif+XVmAj1u6TH
+0m6X0Kb6zsMYYgKPC2hmDsxD3uZ7qBNxxJzzLjpK6eP9aeFKzNyfnaoO4s+9K6Di
+31oxTBpqLI4dcrvg4xWl+YkEknXXaomMqM8hyDzfcAECgYEA9/zmjRpoTAoY3fu9
+mn16wxReFXZZZhqV0+c+gyYtao2Kf2pUNAdhD62HQv7KtAPPHKvLfL8PH0u7bzK0
+vVNzBUukwxGI7gsoTMdc3L5x4v9Yb6jUx7RrDZn93sDod/1f/sb56ARCFQoqbUck
+dSjnVUyF/l5oeh6CgKhvtghJ/AcCgYEA5Lq4kL82qWjIuNUT/C3lzjPfQVU+WvQ9
+wa+x4B4mxm5r4na3AU1T8H+peh4YstAJUgscGfYnLzxuMGuP1ReIuWYy29eDptKl
+WTzVZDcZrAPciP1FOL6jm03PT2UAEuoPRr4OHLg8DxoOqG8pxqk1izDSHG2Tof6l
+0ToafeIALwECgYEA8wvLTgnOpI/U1WNP7aUDd0Rz/WbzsW1m4Lsn+lOleWPllIE6
+q4974mi5Q8ECG7IL/9aj5cw/XvXTauVwXIn4Ff2QKpr58AvBYJaX/cUtS0PlgfIf
+MOczcK43MWUxscADoGmVLn9V4NcIw/dQ1P7U0zXfsXEHxoA2eTAb5HV1RWsCgYBd
+TcXoVfgIV1Q6AcGrR1XNLd/OmOVc2PEwR2l6ERKkM3sS4HZ6s36gRpNt20Ub/D0x
+GJMYDA+j9zTDz7zWokkFyCjLATkVHiyRIH2z6b4xK0oVH6vTIAFBYxZEPuEu1gfx
+RaogEQ9+4ZRFJUOXZIMRCpNLQW/Nz0D4/oi7/SsyAQKBgHEA27Js8ivt+EFCBjwB
+UbkW+LonDAXuUbw91lh5jICCigqUg73HNmV5xpoYI9JNPc6fy6wLyInVUC2w9tpO
+eH2Rl8n79vQMLbzsFClGEC/Q1kAbK5bwUjlfvKBZjvE0RknWX9e1ZY04DSsunSrM
+prS2eHVZ24hecd7j9XfAbHLC
+-----END PRIVATE KEY-----
@@ -589,7 +589,7 @@ Period: %s - %s UTC
 continue
 }

-qm := queue.MakeMsg(from.Path(), rcpt.Address.Path(), has8bit, smtputf8, msgSize, messageID, []byte(msgPrefix), nil, time.Now())
+qm := queue.MakeMsg(from.Path(), rcpt.Address.Path(), has8bit, smtputf8, msgSize, messageID, []byte(msgPrefix), nil, time.Now(), subject)
 // Don't try as long as regular deliveries, and stop before we would send the
 // delayed DSN. Though we also won't send that due to IsTLSReport.
 // ../rfc/8460:1077

@@ -662,7 +662,7 @@ func composeMessage(ctx context.Context, log mlog.Log, mf *os.File, policyDomain
 xc.Line()

 // Textual part, just mentioning this is a TLS report.
-textBody, ct, cte := xc.TextPart(text)
+textBody, ct, cte := xc.TextPart("plain", text)
 textHdr := textproto.MIMEHeader{}
 textHdr.Set("Content-Type", ct)
 textHdr.Set("Content-Transfer-Encoding", cte)
@@ -5,6 +5,7 @@ package webaccount
 import (
 "archive/tar"
 "archive/zip"
+"bytes"
 "compress/gzip"
 "context"
 cryptorand "crypto/rand"

@@ -15,9 +16,11 @@ import (
 "io"
 "log/slog"
 "net/http"
+"net/url"
 "os"
 "path/filepath"
 "strings"
+"time"

 _ "embed"

@@ -30,8 +33,12 @@ import (
 "github.com/mjl-/mox/mlog"
 "github.com/mjl-/mox/mox-"
 "github.com/mjl-/mox/moxvar"
+"github.com/mjl-/mox/queue"
+"github.com/mjl-/mox/smtp"
 "github.com/mjl-/mox/store"
+"github.com/mjl-/mox/webapi"
 "github.com/mjl-/mox/webauth"
+"github.com/mjl-/mox/webhook"
 )

 var pkglog = mlog.New("webaccount", nil)

@@ -414,7 +421,7 @@ func (Account) SetPassword(ctx context.Context, password string) {
 // Account returns information about the account.
 // StorageUsed is the sum of the sizes of all messages, in bytes.
 // StorageLimit is the maximum storage that can be used, or 0 if there is no limit.
-func (Account) Account(ctx context.Context) (account config.Account, storageUsed, storageLimit int64) {
+func (Account) Account(ctx context.Context) (account config.Account, storageUsed, storageLimit int64, suppressions []webapi.Suppression) {
 log := pkglog.WithContext(ctx)
 reqInfo := ctx.Value(requestInfoCtxKey).(requestInfo)

@@ -439,16 +446,19 @@ func (Account) Account(ctx context.Context) (account config.Account, storageUsed
 xcheckf(ctx, err, "get disk usage")
 })

-return accConf, storageUsed, storageLimit
+suppressions, err = queue.SuppressionList(ctx, reqInfo.AccountName)
+xcheckf(ctx, err, "list suppressions")
+
+return accConf, storageUsed, storageLimit, suppressions
 }

+// AccountSaveFullName saves the full name (used as display name in email messages)
+// for the account.
 func (Account) AccountSaveFullName(ctx context.Context, fullName string) {
 reqInfo := ctx.Value(requestInfoCtxKey).(requestInfo)
-_, ok := mox.Conf.Account(reqInfo.AccountName)
-if !ok {
-xcheckf(ctx, errors.New("not found"), "looking up account")
-}
-err := mox.AccountFullNameSave(ctx, reqInfo.AccountName, fullName)
+err := mox.AccountSave(ctx, reqInfo.AccountName, func(acc *config.Account) {
+acc.FullName = fullName
+})
 xcheckf(ctx, err, "saving account full name")
 }

@@ -457,25 +467,29 @@ func (Account) AccountSaveFullName(ctx context.Context, fullName string) {
 // error is returned. Otherwise newDest is saved and the configuration reloaded.
 func (Account) DestinationSave(ctx context.Context, destName string, oldDest, newDest config.Destination) {
 reqInfo := ctx.Value(requestInfoCtxKey).(requestInfo)
-accConf, ok := mox.Conf.Account(reqInfo.AccountName)
-if !ok {
-xcheckf(ctx, errors.New("not found"), "looking up account")
-}
-curDest, ok := accConf.Destinations[destName]
-if !ok {
-xcheckuserf(ctx, errors.New("not found"), "looking up destination")
-}

-if !curDest.Equal(oldDest) {
-xcheckuserf(ctx, errors.New("modified"), "checking stored destination")
-}
+err := mox.AccountSave(ctx, reqInfo.AccountName, func(conf *config.Account) {
+curDest, ok := conf.Destinations[destName]
+if !ok {
+xcheckuserf(ctx, errors.New("not found"), "looking up destination")
+}
+if !curDest.Equal(oldDest) {
+xcheckuserf(ctx, errors.New("modified"), "checking stored destination")
+}

 // Keep fields we manage.
 newDest.DMARCReports = curDest.DMARCReports
 newDest.HostTLSReports = curDest.HostTLSReports
 newDest.DomainTLSReports = curDest.DomainTLSReports

-err := mox.DestinationSave(ctx, reqInfo.AccountName, destName, newDest)
+// Make copy of reference values.
+nd := map[string]config.Destination{}
+for dn, d := range conf.Destinations {
+nd[dn] = d
+}
+nd[destName] = newDest
+conf.Destinations = nd
+})
 xcheckf(ctx, err, "saving destination")
 }

@@ -491,3 +505,159 @@ func (Account) ImportAbort(ctx context.Context, importToken string) error {
 func (Account) Types() (importProgress ImportProgress) {
 return
 }
+
+// SuppressionList lists the addresses on the suppression list of this account.
+func (Account) SuppressionList(ctx context.Context) (suppressions []webapi.Suppression) {
+reqInfo := ctx.Value(requestInfoCtxKey).(requestInfo)
+l, err := queue.SuppressionList(ctx, reqInfo.AccountName)
+xcheckf(ctx, err, "list suppressions")
+return l
+}
+
+// SuppressionAdd adds an email address to the suppression list.
+func (Account) SuppressionAdd(ctx context.Context, address string, manual bool, reason string) (suppression webapi.Suppression) {
+reqInfo := ctx.Value(requestInfoCtxKey).(requestInfo)
+addr, err := smtp.ParseAddress(address)
+xcheckuserf(ctx, err, "parsing address")
+sup := webapi.Suppression{
+Account: reqInfo.AccountName,
+Manual: manual,
+Reason: reason,
+}
+err = queue.SuppressionAdd(ctx, addr.Path(), &sup)
+if err != nil && errors.Is(err, bstore.ErrUnique) {
+xcheckuserf(ctx, err, "add suppression")
+}
+xcheckf(ctx, err, "add suppression")
+return sup
+}
+
+// SuppressionRemove removes the email address from the suppression list.
+func (Account) SuppressionRemove(ctx context.Context, address string) {
+reqInfo := ctx.Value(requestInfoCtxKey).(requestInfo)
+addr, err := smtp.ParseAddress(address)
+xcheckuserf(ctx, err, "parsing address")
+err = queue.SuppressionRemove(ctx, reqInfo.AccountName, addr.Path())
+if err != nil && err == bstore.ErrAbsent {
+xcheckuserf(ctx, err, "remove suppression")
+}
+xcheckf(ctx, err, "remove suppression")
+}
+
+// OutgoingWebhookSave saves a new webhook url for outgoing deliveries. If url
+// is empty, the webhook is disabled. If authorization is non-empty it is used for
+// the Authorization header in HTTP requests. Events specifies the outgoing events
+// to be delivered, or all if empty/nil.
+func (Account) OutgoingWebhookSave(ctx context.Context, url, authorization string, events []string) {
+reqInfo := ctx.Value(requestInfoCtxKey).(requestInfo)
+err := mox.AccountSave(ctx, reqInfo.AccountName, func(acc *config.Account) {
+if url == "" {
+acc.OutgoingWebhook = nil
+} else {
+acc.OutgoingWebhook = &config.OutgoingWebhook{URL: url, Authorization: authorization, Events: events}
+}
+})
+if err != nil && errors.Is(err, mox.ErrConfig) {
+xcheckuserf(ctx, err, "saving account outgoing webhook")
+}
+xcheckf(ctx, err, "saving account outgoing webhook")
+}
+
+// OutgoingWebhookTest makes a test webhook call to urlStr, with optional
+// authorization. If the HTTP request is made this call will succeed also for
+// non-2xx HTTP status codes.
+func (Account) OutgoingWebhookTest(ctx context.Context, urlStr, authorization string, data webhook.Outgoing) (code int, response string, errmsg string) {
+log := pkglog.WithContext(ctx)
+
+xvalidURL(ctx, urlStr)
+log.Debug("making webhook test call for outgoing message", slog.String("url", urlStr))
+
+var b bytes.Buffer
+enc := json.NewEncoder(&b)
+enc.SetIndent("", "\t")
+enc.SetEscapeHTML(false)
+err := enc.Encode(data)
+xcheckf(ctx, err, "encoding outgoing webhook data")
+
+code, response, err = queue.HookPost(ctx, log, 1, 1, urlStr, authorization, b.String())
+if err != nil {
+errmsg = err.Error()
+}
+log.Debugx("result for webhook test call for outgoing message", err, slog.Int("code", code), slog.String("response", response))
+return code, response, errmsg
+}
+
+// IncomingWebhookSave saves a new webhook url for incoming deliveries. If url is
+// empty, the webhook is disabled. If authorization is not empty, it is used in
+// the Authorization header in requests.
+func (Account) IncomingWebhookSave(ctx context.Context, url, authorization string) {
+reqInfo := ctx.Value(requestInfoCtxKey).(requestInfo)
+err := mox.AccountSave(ctx, reqInfo.AccountName, func(acc *config.Account) {
+if url == "" {
+acc.IncomingWebhook = nil
+} else {
+acc.IncomingWebhook = &config.IncomingWebhook{URL: url, Authorization: authorization}
+}
+})
+if err != nil && errors.Is(err, mox.ErrConfig) {
+xcheckuserf(ctx, err, "saving account incoming webhook")
+}
+xcheckf(ctx, err, "saving account incoming webhook")
+}
+
+func xvalidURL(ctx context.Context, s string) {
+u, err := url.Parse(s)
+xcheckuserf(ctx, err, "parsing url")
+if u.Scheme != "http" && u.Scheme != "https" {
+xcheckuserf(ctx, errors.New("scheme must be http or https"), "parsing url")
+}
+}
+
+// IncomingWebhookTest makes a test webhook HTTP delivery request to urlStr,
+// with optional authorization header. If the HTTP call is made, this function
+// returns non-error regardless of HTTP status code.
+func (Account) IncomingWebhookTest(ctx context.Context, urlStr, authorization string, data webhook.Incoming) (code int, response string, errmsg string) {
+log := pkglog.WithContext(ctx)
+
+xvalidURL(ctx, urlStr)
+log.Debug("making webhook test call for incoming message", slog.String("url", urlStr))
+
+var b bytes.Buffer
+enc := json.NewEncoder(&b)
+enc.SetEscapeHTML(false)
+enc.SetIndent("", "\t")
+err := enc.Encode(data)
+xcheckf(ctx, err, "encoding incoming webhook data")
+code, response, err = queue.HookPost(ctx, log, 1, 1, urlStr, authorization, b.String())
+if err != nil {
+errmsg = err.Error()
+}
+log.Debugx("result for webhook test call for incoming message", err, slog.Int("code", code), slog.String("response", response))
+return code, response, errmsg
+}
+
+// FromIDLoginAddressesSave saves new login addresses to enable unique SMTP
+// MAIL FROM addresses ("fromid") for deliveries from the queue.
+func (Account) FromIDLoginAddressesSave(ctx context.Context, loginAddresses []string) {
+reqInfo := ctx.Value(requestInfoCtxKey).(requestInfo)
+err := mox.AccountSave(ctx, reqInfo.AccountName, func(acc *config.Account) {
+acc.FromIDLoginAddresses = loginAddresses
+})
+if err != nil && errors.Is(err, mox.ErrConfig) {
+xcheckuserf(ctx, err, "saving account fromid login addresses")
+}
+xcheckf(ctx, err, "saving account fromid login addresses")
+}
+
+// KeepRetiredPeriodsSave saves periods to save retired messages and webhooks.
+func (Account) KeepRetiredPeriodsSave(ctx context.Context, keepRetiredMessagePeriod, keepRetiredWebhookPeriod time.Duration) {
+reqInfo := ctx.Value(requestInfoCtxKey).(requestInfo)
+err := mox.AccountSave(ctx, reqInfo.AccountName, func(acc *config.Account) {
+acc.KeepRetiredMessagePeriod = keepRetiredMessagePeriod
+acc.KeepRetiredWebhookPeriod = keepRetiredWebhookPeriod
+})
+if err != nil && errors.Is(err, mox.ErrConfig) {
+xcheckuserf(ctx, err, "saving account keep retired periods")
+}
+xcheckf(ctx, err, "saving account keep retired periods")
+}
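OutgoingWebhookTest and IncomingWebhookTest above POST a JSON document to the configured URL, with an optional Authorization header. Below is a minimal sketch of a receiver compatible with the test configuration earlier in this diff (URL http://localhost:1234/incoming and the basic-auth value from testdata/queue/domains.conf); the handler decodes into a generic map rather than the webhook package types, so the body is purely illustrative.

package main

import (
	"encoding/json"
	"log"
	"net/http"
)

func main() {
	// Authorization value as configured in testdata/queue/domains.conf.
	const wantAuth = "Basic dXNlcm5hbWU6cGFzc3dvcmQ="

	http.HandleFunc("/incoming", func(w http.ResponseWriter, r *http.Request) {
		if r.Header.Get("Authorization") != wantAuth {
			http.Error(w, "unauthorized", http.StatusUnauthorized)
			return
		}
		var data map[string]any
		if err := json.NewDecoder(r.Body).Decode(&data); err != nil {
			http.Error(w, "bad json", http.StatusBadRequest)
			return
		}
		log.Printf("incoming webhook: %v", data)
		w.WriteHeader(http.StatusOK)
	})

	log.Fatal(http.ListenAndServe("localhost:1234", nil))
}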
@@ -14,6 +14,7 @@ h2 { font-size: 1.1rem; }
 h3, h4 { font-size: 1rem; }
 ul { padding-left: 1rem; }
 .literal { background-color: #eee; padding: .5em 1em; border: 1px solid #eee; border-radius: 4px; white-space: pre-wrap; font-family: monospace; font-size: 15px; tab-size: 4; }
+table { border-spacing: 0; }
 table td, table th { padding: .2em .5em; }
 table > tbody > tr:nth-child(odd) { background-color: #f8f8f8; }
 table.slim td, table.slim th { padding: 0; }

@@ -23,8 +24,8 @@ p { margin-bottom: 1em; max-width: 50em; }
 fieldset { border: 0; }
 .scriptswitch { text-decoration: underline #dca053 2px; }
 thead { position: sticky; top: 0; background-color: white; box-shadow: 0 1px 1px rgba(0, 0, 0, 0.1); }
-#page { opacity: 1; animation: fadein 0.15s ease-in; }
-#page.loading { opacity: 0.1; animation: fadeout 1s ease-out; }
+#page, .loadend { opacity: 1; animation: fadein 0.15s ease-in; }
+#page.loading, .loadstart { opacity: 0.1; animation: fadeout 1s ease-out; }
 @keyframes fadein { 0% { opacity: 0 } 100% { opacity: 1 } }
 @keyframes fadeout { 0% { opacity: 1 } 100% { opacity: 0.1 } }
 .autosize { display: inline-grid; max-width: 90vw; }
@@ -220,6 +220,7 @@ const [dom, style, attr, prop] = (function () {
 autocomplete: (s) => _attr('autocomplete', s),
 list: (s) => _attr('list', s),
 form: (s) => _attr('form', s),
+size: (s) => _attr('size', s),
 };
 const style = (x) => { return { _styles: x }; };
 const prop = (x) => { return { _props: x }; };
@ -228,11 +229,39 @@ const [dom, style, attr, prop] = (function () {
|
||||||
// NOTE: GENERATED by github.com/mjl-/sherpats, DO NOT MODIFY
|
// NOTE: GENERATED by github.com/mjl-/sherpats, DO NOT MODIFY
|
||||||
var api;
|
var api;
|
||||||
(function (api) {
|
(function (api) {
|
||||||
api.structTypes = { "Account": true, "AutomaticJunkFlags": true, "Destination": true, "Domain": true, "ImportProgress": true, "JunkFilter": true, "Route": true, "Ruleset": true, "SubjectPass": true };
|
// OutgoingEvent is an activity for an outgoing delivery. Either generated by the
|
||||||
api.stringsTypes = { "CSRFToken": true };
|
// queue, or through an incoming DSN (delivery status notification) message.
|
||||||
|
let OutgoingEvent;
|
||||||
|
(function (OutgoingEvent) {
|
||||||
|
// Message was accepted by a next-hop server. This does not necessarily mean the
|
||||||
|
// message has been delivered in the mailbox of the user.
|
||||||
|
OutgoingEvent["EventDelivered"] = "delivered";
|
||||||
|
// Outbound delivery was suppressed because the recipient address is on the
|
||||||
|
// suppression list of the account, or a simplified/base variant of the address is.
|
||||||
|
OutgoingEvent["EventSuppressed"] = "suppressed";
|
||||||
|
OutgoingEvent["EventDelayed"] = "delayed";
|
||||||
|
// Delivery of the message failed and will not be tried again. Also see the
|
||||||
|
// "Suppressing" field of [Outgoing].
|
||||||
|
OutgoingEvent["EventFailed"] = "failed";
|
||||||
|
// Message was relayed into a system that does not generate DSNs. Should only
|
||||||
|
// happen when explicitly requested.
|
||||||
|
OutgoingEvent["EventRelayed"] = "relayed";
|
||||||
|
// Message was accepted and is being delivered to multiple recipients (e.g. the
|
||||||
|
// address was an alias/list), which may generate more DSNs.
|
||||||
|
OutgoingEvent["EventExpanded"] = "expanded";
|
||||||
|
OutgoingEvent["EventCanceled"] = "canceled";
|
||||||
|
+// An incoming message was received that was either a DSN with an unknown event
+// type ("action"), or an incoming non-DSN-message was received for the unique
+// per-outgoing-message address used for sending.
+OutgoingEvent["EventUnrecognized"] = "unrecognized";
+})(OutgoingEvent = api.OutgoingEvent || (api.OutgoingEvent = {}));
+api.structTypes = { "Account": true, "AutomaticJunkFlags": true, "Destination": true, "Domain": true, "ImportProgress": true, "Incoming": true, "IncomingMeta": true, "IncomingWebhook": true, "JunkFilter": true, "NameAddress": true, "Outgoing": true, "OutgoingWebhook": true, "Route": true, "Ruleset": true, "Structure": true, "SubjectPass": true, "Suppression": true };
+api.stringsTypes = { "CSRFToken": true, "OutgoingEvent": true };
 api.intsTypes = {};
 api.types = {
-"Account": { "Name": "Account", "Docs": "", "Fields": [{ "Name": "Domain", "Docs": "", "Typewords": ["string"] }, { "Name": "Description", "Docs": "", "Typewords": ["string"] }, { "Name": "FullName", "Docs": "", "Typewords": ["string"] }, { "Name": "Destinations", "Docs": "", "Typewords": ["{}", "Destination"] }, { "Name": "SubjectPass", "Docs": "", "Typewords": ["SubjectPass"] }, { "Name": "QuotaMessageSize", "Docs": "", "Typewords": ["int64"] }, { "Name": "RejectsMailbox", "Docs": "", "Typewords": ["string"] }, { "Name": "KeepRejects", "Docs": "", "Typewords": ["bool"] }, { "Name": "AutomaticJunkFlags", "Docs": "", "Typewords": ["AutomaticJunkFlags"] }, { "Name": "JunkFilter", "Docs": "", "Typewords": ["nullable", "JunkFilter"] }, { "Name": "MaxOutgoingMessagesPerDay", "Docs": "", "Typewords": ["int32"] }, { "Name": "MaxFirstTimeRecipientsPerDay", "Docs": "", "Typewords": ["int32"] }, { "Name": "NoFirstTimeSenderDelay", "Docs": "", "Typewords": ["bool"] }, { "Name": "Routes", "Docs": "", "Typewords": ["[]", "Route"] }, { "Name": "DNSDomain", "Docs": "", "Typewords": ["Domain"] }] },
+"Account": { "Name": "Account", "Docs": "", "Fields": [{ "Name": "OutgoingWebhook", "Docs": "", "Typewords": ["nullable", "OutgoingWebhook"] }, { "Name": "IncomingWebhook", "Docs": "", "Typewords": ["nullable", "IncomingWebhook"] }, { "Name": "FromIDLoginAddresses", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "KeepRetiredMessagePeriod", "Docs": "", "Typewords": ["int64"] }, { "Name": "KeepRetiredWebhookPeriod", "Docs": "", "Typewords": ["int64"] }, { "Name": "Domain", "Docs": "", "Typewords": ["string"] }, { "Name": "Description", "Docs": "", "Typewords": ["string"] }, { "Name": "FullName", "Docs": "", "Typewords": ["string"] }, { "Name": "Destinations", "Docs": "", "Typewords": ["{}", "Destination"] }, { "Name": "SubjectPass", "Docs": "", "Typewords": ["SubjectPass"] }, { "Name": "QuotaMessageSize", "Docs": "", "Typewords": ["int64"] }, { "Name": "RejectsMailbox", "Docs": "", "Typewords": ["string"] }, { "Name": "KeepRejects", "Docs": "", "Typewords": ["bool"] }, { "Name": "AutomaticJunkFlags", "Docs": "", "Typewords": ["AutomaticJunkFlags"] }, { "Name": "JunkFilter", "Docs": "", "Typewords": ["nullable", "JunkFilter"] }, { "Name": "MaxOutgoingMessagesPerDay", "Docs": "", "Typewords": ["int32"] }, { "Name": "MaxFirstTimeRecipientsPerDay", "Docs": "", "Typewords": ["int32"] }, { "Name": "NoFirstTimeSenderDelay", "Docs": "", "Typewords": ["bool"] }, { "Name": "Routes", "Docs": "", "Typewords": ["[]", "Route"] }, { "Name": "DNSDomain", "Docs": "", "Typewords": ["Domain"] }] },
+"OutgoingWebhook": { "Name": "OutgoingWebhook", "Docs": "", "Fields": [{ "Name": "URL", "Docs": "", "Typewords": ["string"] }, { "Name": "Authorization", "Docs": "", "Typewords": ["string"] }, { "Name": "Events", "Docs": "", "Typewords": ["[]", "string"] }] },
+"IncomingWebhook": { "Name": "IncomingWebhook", "Docs": "", "Fields": [{ "Name": "URL", "Docs": "", "Typewords": ["string"] }, { "Name": "Authorization", "Docs": "", "Typewords": ["string"] }] },
 "Destination": { "Name": "Destination", "Docs": "", "Fields": [{ "Name": "Mailbox", "Docs": "", "Typewords": ["string"] }, { "Name": "Rulesets", "Docs": "", "Typewords": ["[]", "Ruleset"] }, { "Name": "FullName", "Docs": "", "Typewords": ["string"] }] },
 "Ruleset": { "Name": "Ruleset", "Docs": "", "Fields": [{ "Name": "SMTPMailFromRegexp", "Docs": "", "Typewords": ["string"] }, { "Name": "VerifiedDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "HeadersRegexp", "Docs": "", "Typewords": ["{}", "string"] }, { "Name": "IsForward", "Docs": "", "Typewords": ["bool"] }, { "Name": "ListAllowDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "AcceptRejectsToMailbox", "Docs": "", "Typewords": ["string"] }, { "Name": "Mailbox", "Docs": "", "Typewords": ["string"] }, { "Name": "VerifiedDNSDomain", "Docs": "", "Typewords": ["Domain"] }, { "Name": "ListAllowDNSDomain", "Docs": "", "Typewords": ["Domain"] }] },
 "Domain": { "Name": "Domain", "Docs": "", "Fields": [{ "Name": "ASCII", "Docs": "", "Typewords": ["string"] }, { "Name": "Unicode", "Docs": "", "Typewords": ["string"] }] },
@@ -240,11 +269,20 @@ var api;
 "AutomaticJunkFlags": { "Name": "AutomaticJunkFlags", "Docs": "", "Fields": [{ "Name": "Enabled", "Docs": "", "Typewords": ["bool"] }, { "Name": "JunkMailboxRegexp", "Docs": "", "Typewords": ["string"] }, { "Name": "NeutralMailboxRegexp", "Docs": "", "Typewords": ["string"] }, { "Name": "NotJunkMailboxRegexp", "Docs": "", "Typewords": ["string"] }] },
 "JunkFilter": { "Name": "JunkFilter", "Docs": "", "Fields": [{ "Name": "Threshold", "Docs": "", "Typewords": ["float64"] }, { "Name": "Onegrams", "Docs": "", "Typewords": ["bool"] }, { "Name": "Twograms", "Docs": "", "Typewords": ["bool"] }, { "Name": "Threegrams", "Docs": "", "Typewords": ["bool"] }, { "Name": "MaxPower", "Docs": "", "Typewords": ["float64"] }, { "Name": "TopWords", "Docs": "", "Typewords": ["int32"] }, { "Name": "IgnoreWords", "Docs": "", "Typewords": ["float64"] }, { "Name": "RareWords", "Docs": "", "Typewords": ["int32"] }] },
 "Route": { "Name": "Route", "Docs": "", "Fields": [{ "Name": "FromDomain", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "ToDomain", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "MinimumAttempts", "Docs": "", "Typewords": ["int32"] }, { "Name": "Transport", "Docs": "", "Typewords": ["string"] }, { "Name": "FromDomainASCII", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "ToDomainASCII", "Docs": "", "Typewords": ["[]", "string"] }] },
+"Suppression": { "Name": "Suppression", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Created", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "Account", "Docs": "", "Typewords": ["string"] }, { "Name": "BaseAddress", "Docs": "", "Typewords": ["string"] }, { "Name": "OriginalAddress", "Docs": "", "Typewords": ["string"] }, { "Name": "Manual", "Docs": "", "Typewords": ["bool"] }, { "Name": "Reason", "Docs": "", "Typewords": ["string"] }] },
 "ImportProgress": { "Name": "ImportProgress", "Docs": "", "Fields": [{ "Name": "Token", "Docs": "", "Typewords": ["string"] }] },
+"Outgoing": { "Name": "Outgoing", "Docs": "", "Fields": [{ "Name": "Version", "Docs": "", "Typewords": ["int32"] }, { "Name": "Event", "Docs": "", "Typewords": ["OutgoingEvent"] }, { "Name": "DSN", "Docs": "", "Typewords": ["bool"] }, { "Name": "Suppressing", "Docs": "", "Typewords": ["bool"] }, { "Name": "QueueMsgID", "Docs": "", "Typewords": ["int64"] }, { "Name": "FromID", "Docs": "", "Typewords": ["string"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "WebhookQueued", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "SMTPCode", "Docs": "", "Typewords": ["int32"] }, { "Name": "SMTPEnhancedCode", "Docs": "", "Typewords": ["string"] }, { "Name": "Error", "Docs": "", "Typewords": ["string"] }, { "Name": "Extra", "Docs": "", "Typewords": ["{}", "string"] }] },
+"Incoming": { "Name": "Incoming", "Docs": "", "Fields": [{ "Name": "Version", "Docs": "", "Typewords": ["int32"] }, { "Name": "From", "Docs": "", "Typewords": ["[]", "NameAddress"] }, { "Name": "To", "Docs": "", "Typewords": ["[]", "NameAddress"] }, { "Name": "CC", "Docs": "", "Typewords": ["[]", "NameAddress"] }, { "Name": "BCC", "Docs": "", "Typewords": ["[]", "NameAddress"] }, { "Name": "ReplyTo", "Docs": "", "Typewords": ["[]", "NameAddress"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }, { "Name": "InReplyTo", "Docs": "", "Typewords": ["string"] }, { "Name": "References", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Date", "Docs": "", "Typewords": ["nullable", "timestamp"] }, { "Name": "Text", "Docs": "", "Typewords": ["string"] }, { "Name": "HTML", "Docs": "", "Typewords": ["string"] }, { "Name": "Structure", "Docs": "", "Typewords": ["Structure"] }, { "Name": "Meta", "Docs": "", "Typewords": ["IncomingMeta"] }] },
+"NameAddress": { "Name": "NameAddress", "Docs": "", "Fields": [{ "Name": "Name", "Docs": "", "Typewords": ["string"] }, { "Name": "Address", "Docs": "", "Typewords": ["string"] }] },
+"Structure": { "Name": "Structure", "Docs": "", "Fields": [{ "Name": "ContentType", "Docs": "", "Typewords": ["string"] }, { "Name": "ContentTypeParams", "Docs": "", "Typewords": ["{}", "string"] }, { "Name": "ContentID", "Docs": "", "Typewords": ["string"] }, { "Name": "DecodedSize", "Docs": "", "Typewords": ["int64"] }, { "Name": "Parts", "Docs": "", "Typewords": ["[]", "Structure"] }] },
+"IncomingMeta": { "Name": "IncomingMeta", "Docs": "", "Fields": [{ "Name": "MsgID", "Docs": "", "Typewords": ["int64"] }, { "Name": "MailFrom", "Docs": "", "Typewords": ["string"] }, { "Name": "MailFromValidated", "Docs": "", "Typewords": ["bool"] }, { "Name": "MsgFromValidated", "Docs": "", "Typewords": ["bool"] }, { "Name": "RcptTo", "Docs": "", "Typewords": ["string"] }, { "Name": "DKIMVerifiedDomains", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "RemoteIP", "Docs": "", "Typewords": ["string"] }, { "Name": "Received", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "MailboxName", "Docs": "", "Typewords": ["string"] }, { "Name": "Automated", "Docs": "", "Typewords": ["bool"] }] },
 "CSRFToken": { "Name": "CSRFToken", "Docs": "", "Values": null },
+"OutgoingEvent": { "Name": "OutgoingEvent", "Docs": "", "Values": [{ "Name": "EventDelivered", "Value": "delivered", "Docs": "" }, { "Name": "EventSuppressed", "Value": "suppressed", "Docs": "" }, { "Name": "EventDelayed", "Value": "delayed", "Docs": "" }, { "Name": "EventFailed", "Value": "failed", "Docs": "" }, { "Name": "EventRelayed", "Value": "relayed", "Docs": "" }, { "Name": "EventExpanded", "Value": "expanded", "Docs": "" }, { "Name": "EventCanceled", "Value": "canceled", "Docs": "" }, { "Name": "EventUnrecognized", "Value": "unrecognized", "Docs": "" }] },
 };
 api.parser = {
 Account: (v) => api.parse("Account", v),
+OutgoingWebhook: (v) => api.parse("OutgoingWebhook", v),
+IncomingWebhook: (v) => api.parse("IncomingWebhook", v),
 Destination: (v) => api.parse("Destination", v),
 Ruleset: (v) => api.parse("Ruleset", v),
 Domain: (v) => api.parse("Domain", v),
@@ -252,8 +290,15 @@ var api;
 AutomaticJunkFlags: (v) => api.parse("AutomaticJunkFlags", v),
 JunkFilter: (v) => api.parse("JunkFilter", v),
 Route: (v) => api.parse("Route", v),
+Suppression: (v) => api.parse("Suppression", v),
 ImportProgress: (v) => api.parse("ImportProgress", v),
+Outgoing: (v) => api.parse("Outgoing", v),
+Incoming: (v) => api.parse("Incoming", v),
+NameAddress: (v) => api.parse("NameAddress", v),
+Structure: (v) => api.parse("Structure", v),
+IncomingMeta: (v) => api.parse("IncomingMeta", v),
 CSRFToken: (v) => api.parse("CSRFToken", v),
+OutgoingEvent: (v) => api.parse("OutgoingEvent", v),
 };
 // Account exports web API functions for the account web interface. All its
 // methods are exported under api/. Function calls require valid HTTP
@@ -322,10 +367,12 @@ var api;
 async Account() {
 const fn = "Account";
 const paramTypes = [];
-const returnTypes = [["Account"], ["int64"], ["int64"]];
+const returnTypes = [["Account"], ["int64"], ["int64"], ["[]", "Suppression"]];
 const params = [];
 return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
 }
+// AccountSaveFullName saves the full name (used as display name in email messages)
+// for the account.
 async AccountSaveFullName(fullName) {
 const fn = "AccountSaveFullName";
 const paramTypes = [["string"]];
@@ -360,6 +407,88 @@ var api;
 const params = [];
 return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
 }
+// SuppressionList lists the addresses on the suppression list of this account.
+async SuppressionList() {
+const fn = "SuppressionList";
+const paramTypes = [];
+const returnTypes = [["[]", "Suppression"]];
+const params = [];
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
+}
+// SuppressionAdd adds an email address to the suppression list.
+async SuppressionAdd(address, manual, reason) {
+const fn = "SuppressionAdd";
+const paramTypes = [["string"], ["bool"], ["string"]];
+const returnTypes = [["Suppression"]];
+const params = [address, manual, reason];
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
+}
+// SuppressionRemove removes the email address from the suppression list.
+async SuppressionRemove(address) {
+const fn = "SuppressionRemove";
+const paramTypes = [["string"]];
+const returnTypes = [];
+const params = [address];
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
+}
+// OutgoingWebhookSave saves a new webhook url for outgoing deliveries. If url
+// is empty, the webhook is disabled. If authorization is non-empty it is used for
+// the Authorization header in HTTP requests. Events specifies the outgoing events
+// to be delivered, or all if empty/nil.
+async OutgoingWebhookSave(url, authorization, events) {
+const fn = "OutgoingWebhookSave";
+const paramTypes = [["string"], ["string"], ["[]", "string"]];
+const returnTypes = [];
+const params = [url, authorization, events];
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
+}
+// OutgoingWebhookTest makes a test webhook call to urlStr, with optional
+// authorization. If the HTTP request is made this call will succeed also for
+// non-2xx HTTP status codes.
+async OutgoingWebhookTest(urlStr, authorization, data) {
+const fn = "OutgoingWebhookTest";
+const paramTypes = [["string"], ["string"], ["Outgoing"]];
+const returnTypes = [["int32"], ["string"], ["string"]];
+const params = [urlStr, authorization, data];
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
+}
+// IncomingWebhookSave saves a new webhook url for incoming deliveries. If url is
+// empty, the webhook is disabled. If authorization is not empty, it is used in
+// the Authorization header in requests.
+async IncomingWebhookSave(url, authorization) {
+const fn = "IncomingWebhookSave";
+const paramTypes = [["string"], ["string"]];
+const returnTypes = [];
+const params = [url, authorization];
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
+}
+// IncomingWebhookTest makes a test webhook HTTP delivery request to urlStr,
+// with optional authorization header. If the HTTP call is made, this function
+// returns non-error regardless of HTTP status code.
+async IncomingWebhookTest(urlStr, authorization, data) {
+const fn = "IncomingWebhookTest";
+const paramTypes = [["string"], ["string"], ["Incoming"]];
+const returnTypes = [["int32"], ["string"], ["string"]];
+const params = [urlStr, authorization, data];
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
+}
+// FromIDLoginAddressesSave saves new login addresses to enable unique SMTP
+// MAIL FROM addresses ("fromid") for deliveries from the queue.
+async FromIDLoginAddressesSave(loginAddresses) {
+const fn = "FromIDLoginAddressesSave";
+const paramTypes = [["[]", "string"]];
+const returnTypes = [];
+const params = [loginAddresses];
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
+}
+// KeepRetiredPeriodsSave save periods to save retired messages and webhooks.
+async KeepRetiredPeriodsSave(keepRetiredMessagePeriod, keepRetiredWebhookPeriod) {
+const fn = "KeepRetiredPeriodsSave";
+const paramTypes = [["int64"], ["int64"]];
+const returnTypes = [];
+const params = [keepRetiredMessagePeriod, keepRetiredWebhookPeriod];
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
+}
 }
 api.Client = Client;
 api.defaultBaseURL = (function () {
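
editor note: the new client methods above give the account page (and any application using the same generated client) a plain JSON path to webhook configuration and the suppression list. a minimal sketch of calling them from this file's own TypeScript, reusing the `client` instance and field names already present in this diff; the URL, address and reason strings are made-up examples, not values from the patch:

    // sketch only: configure an outgoing webhook and manage the suppression list.
    const configureDeliveryFeedback = async () => {
        // an empty events list means all outgoing events are delivered to the webhook.
        await client.OutgoingWebhookSave('https://app.example/mox/outgoing-hook', '', [])
        // list current suppressions, add a manual entry, then remove it again.
        const current = await client.SuppressionList()
        console.log('suppressed addresses:', (current || []).map(s => s.OriginalAddress))
        await client.SuppressionAdd('bounced@example.org', true, 'manual test entry')
        await client.SuppressionRemove('bounced@example.org')
    }
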
@@ -753,6 +882,37 @@ const login = async (reason) => {
 username.focus();
 });
 };
+// Popup shows kids in a centered div with white background on top of a
+// transparent overlay on top of the window. Clicking the overlay or hitting
+// Escape closes the popup. Scrollbars are automatically added to the div with
+// kids. Returns a function that removes the popup.
+const popup = (...kids) => {
+const origFocus = document.activeElement;
+const close = () => {
+if (!root.parentNode) {
+return;
+}
+root.remove();
+if (origFocus && origFocus instanceof HTMLElement && origFocus.parentNode) {
+origFocus.focus();
+}
+};
+let content;
+const root = dom.div(style({ position: 'fixed', top: 0, right: 0, bottom: 0, left: 0, backgroundColor: 'rgba(0, 0, 0, 0.1)', display: 'flex', alignItems: 'center', justifyContent: 'center', zIndex: '1' }), function keydown(e) {
+if (e.key === 'Escape') {
+e.stopPropagation();
+close();
+}
+}, function click(e) {
+e.stopPropagation();
+close();
+}, content = dom.div(attr.tabindex('0'), style({ backgroundColor: 'white', borderRadius: '.25em', padding: '1em', boxShadow: '0 0 20px rgba(0, 0, 0, 0.1)', border: '1px solid #ddd', maxWidth: '95vw', overflowX: 'auto', maxHeight: '95vh', overflowY: 'auto' }), function click(e) {
+e.stopPropagation();
+}, kids));
+document.body.appendChild(root);
+content.focus();
+return close;
+};
 const localStorageGet = (k) => {
 try {
 return window.localStorage.getItem(k);
@@ -842,6 +1002,39 @@ const green = '#1dea20';
 const yellow = '#ffe400';
 const red = '#ff7443';
 const blue = '#8bc8ff';
+const age = (date) => {
+const r = dom.span(dom._class('notooltip'), attr.title(date.toString()));
+const nowSecs = new Date().getTime() / 1000;
+let t = nowSecs - date.getTime() / 1000;
+let negative = '';
+if (t < 0) {
+negative = '-';
+t = -t;
+}
+const minute = 60;
+const hour = 60 * minute;
+const day = 24 * hour;
+const month = 30 * day;
+const year = 365 * day;
+const periods = [year, month, day, hour, minute];
+const suffix = ['y', 'mo', 'd', 'h', 'min'];
+let s;
+for (let i = 0; i < periods.length; i++) {
+const p = periods[i];
+if (t >= 2 * p || i === periods.length - 1) {
+const n = Math.round(t / p);
+s = '' + n + suffix[i];
+break;
+}
+}
+if (t < 60) {
+s = '<1min';
+// Prevent showing '-<1min' when browser and server have relatively small time drift of max 1 minute.
+negative = '';
+}
+dom._kids(r, negative + s);
+return r;
+};
 const formatQuotaSize = (v) => {
 if (v === 0) {
 return '0';
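
editor note: as a worked example of the rounding above (sample values, not part of the patch): a period is only picked once the age reaches twice that period, and anything under a minute collapses to '<1min' to hide small browser/server clock drift.

    // sketch: text produced by age() for a few sample ages.
    age(new Date(Date.now() - 3 * 24 * 3600 * 1000));	// "3d" (72h >= 2*day)
    age(new Date(Date.now() - 90 * 60 * 1000));	// "90min" (1.5h < 2*hour, falls through to minutes)
    age(new Date(Date.now() - 45 * 1000));	// "<1min"
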
@@ -861,7 +1054,7 @@ const formatQuotaSize = (v) => {
 return '' + v;
 };
 const index = async () => {
-const [acc, storageUsed, storageLimit] = await client.Account();
+const [acc, storageUsed, storageLimit, suppressions] = await client.Account();
 let fullNameForm;
 let fullNameFieldset;
 let fullName;
@@ -870,12 +1063,78 @@ const index = async () => {
 let password1;
 let password2;
 let passwordHint;
+let outgoingWebhookFieldset;
+let outgoingWebhookURL;
+let outgoingWebhookAuthorization;
+let outgoingWebhookEvents;
+let incomingWebhookFieldset;
+let incomingWebhookURL;
+let incomingWebhookAuthorization;
+let keepRetiredPeriodsFieldset;
+let keepRetiredMessagePeriod;
+let keepRetiredWebhookPeriod;
+let fromIDLoginAddressesFieldset;
+const second = 1000 * 1000 * 1000;
+const minute = 60 * second;
+const hour = 60 * minute;
+const day = 24 * hour;
+const week = 7 * day;
+const parseDuration = (s) => {
+if (!s) {
+return 0;
+}
+const xparseint = () => {
+const v = parseInt(s.substring(0, s.length - 1));
+if (isNaN(v) || Math.round(v) !== v) {
+throw new Error('bad number in duration');
+}
+return v;
+};
+if (s.endsWith('w')) {
+return xparseint() * week;
+}
+if (s.endsWith('d')) {
+return xparseint() * day;
+}
+if (s.endsWith('h')) {
+return xparseint() * hour;
+}
+if (s.endsWith('m')) {
+return xparseint() * minute;
+}
+if (s.endsWith('s')) {
+return xparseint() * second;
+}
+throw new Error('bad duration ' + s);
+};
+const formatDuration = (v) => {
+if (v === 0) {
+return '';
+}
+const is = (period) => v > 0 && Math.round(v / period) === v / period;
+const format = (period, s) => '' + (v / period) + s;
+if (is(week)) {
+return format(week, 'w');
+}
+if (is(day)) {
+return format(day, 'd');
+}
+if (is(hour)) {
+return format(hour, 'h');
+}
+if (is(minute)) {
+return format(minute, 'm');
+}
+return format(second, 's');
+};
 let importForm;
 let importFieldset;
 let mailboxFileHint;
 let mailboxPrefixHint;
 let importProgress;
 let importAbortBox;
+let suppressionAddress;
+let suppressionReason;
 const importTrack = async (token) => {
 const importConnection = dom.div('Waiting for updates...');
 importProgress.appendChild(importConnection);
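
editor note: the duration helpers above work in nanoseconds (`second` is 1000*1000*1000), matching the int64 KeepRetiredMessagePeriod/KeepRetiredWebhookPeriod fields in the Account type. with example values only, the retention strings round-trip like this:

    // sketch: duration string <-> nanosecond round-trips with the helpers above.
    parseDuration('3d');	// 3 * 24 * 3600 * 1e9 = 259200000000000
    formatDuration(7 * 24 * 3600 * 1e9);	// '1w'
    formatDuration(0);	// '' (empty input field, no retention period configured)
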
@@ -952,6 +1211,139 @@ const index = async () => {
 const exportForm = (filename) => {
 return dom.form(attr.target('_blank'), attr.method('POST'), attr.action('export/' + filename), dom.input(attr.type('hidden'), attr.name('csrf'), attr.value(localStorageGet('webaccountcsrftoken') || '')), dom.submitbutton('Export'));
 };
+const authorizationPopup = (dest) => {
+let username;
+let password;
+const close = popup(dom.form(function submit(e) {
+e.preventDefault();
+e.stopPropagation();
+dest.value = 'Basic ' + window.btoa(username.value + ':' + password.value);
+close();
+}, dom.p('Compose HTTP Basic authentication header'), dom.div(style({ marginBottom: '1ex' }), dom.div(dom.label('Username')), username = dom.input(attr.required(''))), dom.div(style({ marginBottom: '1ex' }), dom.div(dom.label('Password (shown in clear)')), password = dom.input(attr.required(''))), dom.div(style({ marginBottom: '1ex' }), dom.submitbutton('Set')), dom.div('A HTTP Basic authorization header contains the password in plain text, as base64.')));
+username.focus();
+};
+const popupTestOutgoing = () => {
+let fieldset;
+let event;
+let dsn;
+let suppressing;
+let queueMsgID;
+let fromID;
+let messageID;
+let error;
+let extra;
+let body;
+let curl;
+let result;
+let data = {
+Version: 0,
+Event: api.OutgoingEvent.EventDelivered,
+DSN: false,
+Suppressing: false,
+QueueMsgID: 123,
+FromID: 'MDEyMzQ1Njc4OWFiY2RlZg',
+MessageID: '<QnxzgulZK51utga6agH_rg@mox.example>',
+Subject: 'test from mox web pages',
+WebhookQueued: new Date(),
+SMTPCode: 0,
+SMTPEnhancedCode: '',
+Error: '',
+Extra: {},
+};
+const onchange = function change() {
+data = {
+Version: 0,
+Event: event.value,
+DSN: dsn.checked,
+Suppressing: suppressing.checked,
+QueueMsgID: parseInt(queueMsgID.value),
+FromID: fromID.value,
+MessageID: messageID.value,
+Subject: 'test from mox web pages',
+WebhookQueued: new Date(),
+SMTPCode: 0,
+SMTPEnhancedCode: '',
+Error: error.value,
+Extra: JSON.parse(extra.value),
+};
+const curlStr = "curl " + (outgoingWebhookAuthorization.value ? "-H 'Authorization: " + outgoingWebhookAuthorization.value + "' " : "") + "-H 'X-Mox-Webhook-ID: 1' -H 'X-Mox-Webhook-Attempt: 1' --json '" + JSON.stringify(data) + "' '" + outgoingWebhookURL.value + "'";
+dom._kids(curl, style({ maxWidth: '45em', wordBreak: 'break-all' }), curlStr);
+body.value = JSON.stringify(data, undefined, "\t");
+};
+popup(dom.h1('Test webhook for outgoing delivery'), dom.form(async function submit(e) {
+e.preventDefault();
+e.stopPropagation();
+result.classList.add('loadstart');
+const [code, response, errmsg] = await check(fieldset, client.OutgoingWebhookTest(outgoingWebhookURL.value, outgoingWebhookAuthorization.value, data));
+const nresult = dom.div(dom._class('loadend'), dom.table(dom.tr(dom.td('HTTP status code'), dom.td('' + code)), dom.tr(dom.td('Error message'), dom.td(errmsg)), dom.tr(dom.td('Response'), dom.td(response))));
+result.replaceWith(nresult);
+result = nresult;
+}, fieldset = dom.fieldset(dom.p('Make a test call to ', dom.b(outgoingWebhookURL.value), '.'), dom.div(style({ display: 'flex', gap: '1em' }), dom.div(dom.h2('Parameters'), dom.div(style({ marginBottom: '.5ex' }), dom.label('Event', dom.div(event = dom.select(onchange, ["delivered", "suppressed", "delayed", "failed", "relayed", "expanded", "canceled", "unrecognized"].map(s => dom.option(s.substring(0, 1).toUpperCase() + s.substring(1), attr.value(s))))))), dom.div(style({ marginBottom: '.5ex' }), dom.label(dsn = dom.input(attr.type('checkbox')), ' DSN', onchange)), dom.div(style({ marginBottom: '.5ex' }), dom.label(suppressing = dom.input(attr.type('checkbox')), ' Suppressing', onchange)), dom.div(style({ marginBottom: '.5ex' }), dom.label('Queue message ID ', dom.div(queueMsgID = dom.input(attr.required(''), attr.type('number'), attr.value('123'), onchange)))), dom.div(style({ marginBottom: '.5ex' }), dom.label('From ID ', dom.div(fromID = dom.input(attr.required(''), attr.value(data.FromID), onchange)))), dom.div(style({ marginBottom: '.5ex' }), dom.label('MessageID', dom.div(messageID = dom.input(attr.required(''), attr.value(data.MessageID), onchange)))), dom.div(style({ marginBottom: '.5ex' }), dom.label('Error', dom.div(error = dom.input(onchange)))), dom.div(style({ marginBottom: '.5ex' }), dom.label('Extra', dom.div(extra = dom.input(attr.required(''), attr.value('{}'), onchange))))), dom.div(dom.h2('Headers'), dom.pre('X-Mox-Webhook-ID: 1\nX-Mox-Webhook-Attempt: 1'), dom.br(), dom.h2('JSON'), body = dom.textarea(attr.disabled(''), attr.rows('15'), style({ width: '30em' })), dom.br(), dom.h2('curl'), curl = dom.div(dom._class('literal')))), dom.br(), dom.div(style({ textAlign: 'right' }), dom.submitbutton('Post')), dom.br(), result = dom.div())));
+onchange();
+};
+const popupTestIncoming = () => {
+let fieldset;
+let body;
+let curl;
+let result;
+let data = {
+Version: 0,
+From: [{ Name: 'remote', Address: 'remote@remote.example' }],
+To: [{ Name: 'mox', Address: 'mox@mox.example' }],
+CC: [],
+BCC: [],
+ReplyTo: [],
+Subject: 'test webhook for incoming message',
+MessageID: '<QnxzgulZK51utga6agH_rg@mox.example>',
+InReplyTo: '',
+References: [],
+Date: new Date(),
+Text: 'hi ☺\n',
+HTML: '',
+Structure: {
+ContentType: 'text/plain',
+ContentTypeParams: { charset: 'utf-8' },
+ContentID: '',
+DecodedSize: 8,
+Parts: [],
+},
+Meta: {
+MsgID: 1,
+MailFrom: 'remote@remote.example',
+MailFromValidated: true,
+MsgFromValidated: true,
+RcptTo: 'mox@localhost',
+DKIMVerifiedDomains: ['remote.example'],
+RemoteIP: '127.0.0.1',
+Received: new Date(),
+MailboxName: 'Inbox',
+Automated: false,
+},
+};
+const onchange = function change() {
+try {
+api.parser.Incoming(JSON.parse(body.value));
+}
+catch (err) {
+console.log({ err });
+window.alert('Error parsing data: ' + errmsg(err));
+}
+const curlStr = "curl " + (incomingWebhookAuthorization.value ? "-H 'Authorization: " + incomingWebhookAuthorization.value + "' " : "") + "-H 'X-Mox-Webhook-ID: 1' -H 'X-Mox-Webhook-Attempt: 1' --json '" + JSON.stringify(data) + "' '" + incomingWebhookURL.value + "'";
+dom._kids(curl, style({ maxWidth: '45em', wordBreak: 'break-all' }), curlStr);
+};
+popup(dom.h1('Test webhook for incoming delivery'), dom.form(async function submit(e) {
+e.preventDefault();
+e.stopPropagation();
+result.classList.add('loadstart');
+const [code, response, errmsg] = await check(fieldset, (async () => await client.IncomingWebhookTest(incomingWebhookURL.value, incomingWebhookAuthorization.value, api.parser.Incoming(JSON.parse(body.value))))());
+const nresult = dom.div(dom._class('loadend'), dom.table(dom.tr(dom.td('HTTP status code'), dom.td('' + code)), dom.tr(dom.td('Error message'), dom.td(errmsg)), dom.tr(dom.td('Response'), dom.td(response))));
+result.replaceWith(nresult);
+result = nresult;
+}, fieldset = dom.fieldset(dom.p('Make a test call to ', dom.b(incomingWebhookURL.value), '.'), dom.div(style({ display: 'flex', gap: '1em' }), dom.div(dom.h2('JSON'), body = dom.textarea(style({ maxHeight: '90vh' }), style({ width: '30em' }), onchange)), dom.div(dom.h2('Headers'), dom.pre('X-Mox-Webhook-ID: 1\nX-Mox-Webhook-Attempt: 1'), dom.br(), dom.h2('curl'), curl = dom.div(dom._class('literal')))), dom.br(), dom.div(style({ textAlign: 'right' }), dom.submitbutton('Post')), dom.br(), result = dom.div())));
+body.value = JSON.stringify(data, undefined, '\t');
+body.setAttribute('rows', '' + Math.min(40, (body.value.split('\n').length + 1)));
+onchange();
+};
 dom._kids(page, crumbs('Mox Account'), dom.p('NOTE: Not all account settings can be configured through these pages yet. See the configuration file for more options.'), dom.div('Default domain: ', acc.DNSDomain.ASCII ? domainString(acc.DNSDomain) : '(none)'), dom.br(), fullNameForm = dom.form(fullNameFieldset = dom.fieldset(dom.label(style({ display: 'inline-block' }), 'Full name', dom.br(), fullName = dom.input(attr.value(acc.FullName), attr.title('Name to use in From header when composing messages. Can be overridden per configured address.'))), ' ', dom.submitbutton('Save')), async function submit(e) {
 e.preventDefault();
 await check(fullNameFieldset, client.AccountSaveFullName(fullName.value));
|
||||||
' (',
|
' (',
|
||||||
'' + Math.floor(100 * storageUsed / storageLimit),
|
'' + Math.floor(100 * storageUsed / storageLimit),
|
||||||
'%).',
|
'%).',
|
||||||
] : [', no explicit limit is configured.']), dom.h2('Export'), dom.p('Export all messages in all mailboxes. In maildir or mbox format, as .zip or .tgz file.'), dom.table(dom._class('slim'), dom.tr(dom.td('Maildirs in .tgz'), dom.td(exportForm('mail-export-maildir.tgz'))), dom.tr(dom.td('Maildirs in .zip'), dom.td(exportForm('mail-export-maildir.zip'))), dom.tr(dom.td('Mbox files in .tgz'), dom.td(exportForm('mail-export-mbox.tgz'))), dom.tr(dom.td('Mbox files in .zip'), dom.td(exportForm('mail-export-mbox.zip')))), dom.br(), dom.h2('Import'), dom.p('Import messages from a .zip or .tgz file with maildirs and/or mbox files.'), importForm = dom.form(async function submit(e) {
|
] : [', no explicit limit is configured.']), dom.h2('Webhooks'), dom.h3('Outgoing', attr.title('Webhooks for outgoing messages are called for each attempt to deliver a message in the outgoing queue, e.g. when the queue has delivered a message to the next hop, when a single attempt failed with a temporary error, when delivery permanently failed, or when DSN (delivery status notification) messages were received about a previously sent message.')), dom.form(async function submit(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
e.stopPropagation();
|
||||||
|
await check(outgoingWebhookFieldset, client.OutgoingWebhookSave(outgoingWebhookURL.value, outgoingWebhookAuthorization.value, [...outgoingWebhookEvents.selectedOptions].map(o => o.value)));
|
||||||
|
}, outgoingWebhookFieldset = dom.fieldset(dom.div(style({ display: 'flex', gap: '1em' }), dom.div(dom.label(dom.div('URL', attr.title('URL to do an HTTP POST to for each event. Webhooks are disabled if empty.')), outgoingWebhookURL = dom.input(attr.value(acc.OutgoingWebhook?.URL || ''), style({ width: '30em' })))), dom.div(dom.label(dom.div('Authorization header ', dom.a('Basic', attr.href(''), function click(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
authorizationPopup(outgoingWebhookAuthorization);
|
||||||
|
}), attr.title('If non-empty, HTTP requests have this value as Authorization header, e.g. Basic <base64-encoded-username-password>.')), outgoingWebhookAuthorization = dom.input(attr.value(acc.OutgoingWebhook?.Authorization || '')))), dom.div(dom.label(style({ verticalAlign: 'top' }), dom.div('Events', attr.title('Either limit to specific events, or receive all events (default).')), outgoingWebhookEvents = dom.select(style({ verticalAlign: 'bottom' }), attr.multiple(''), attr.size('8'), // Number of options.
|
||||||
|
["delivered", "suppressed", "delayed", "failed", "relayed", "expanded", "canceled", "unrecognized"].map(s => dom.option(s.substring(0, 1).toUpperCase() + s.substring(1), attr.value(s), acc.OutgoingWebhook?.Events?.includes(s) ? attr.selected('') : []))))), dom.div(dom.div(dom.label('\u00a0')), dom.submitbutton('Save'), ' ', dom.clickbutton('Test', function click() {
|
||||||
|
popupTestOutgoing();
|
||||||
|
}))))), dom.br(), dom.h3('Incoming', attr.title('Webhooks for incoming messages are called for each message received over SMTP, excluding DSN messages about previous deliveries.')), dom.form(async function submit(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
e.stopPropagation();
|
||||||
|
await check(incomingWebhookFieldset, client.IncomingWebhookSave(incomingWebhookURL.value, incomingWebhookAuthorization.value));
|
||||||
|
}, incomingWebhookFieldset = dom.fieldset(dom.div(style({ display: 'flex', gap: '1em' }), dom.div(dom.label(dom.div('URL'), incomingWebhookURL = dom.input(attr.value(acc.IncomingWebhook?.URL || ''), style({ width: '30em' })))), dom.div(dom.label(dom.div('Authorization header ', dom.a('Basic', attr.href(''), function click(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
authorizationPopup(incomingWebhookAuthorization);
|
||||||
|
}), attr.title('If non-empty, HTTP requests have this value as Authorization header, e.g. Basic <base64-encoded-username-password>.')), incomingWebhookAuthorization = dom.input(attr.value(acc.IncomingWebhook?.Authorization || '')))), dom.div(dom.div(dom.label('\u00a0')), dom.submitbutton('Save'), ' ', dom.clickbutton('Test', function click() {
|
||||||
|
popupTestIncoming();
|
||||||
|
}))))), dom.br(), dom.h2('Keep messages/webhooks retired from queue', attr.title('After delivering a message or webhook from the queue it is removed by default. But you can also keep these "retired" messages/webhooks around for a while. With unique SMTP MAIL FROM addresses configured below, this allows relating incoming delivery status notification messages (DSNs) to previously sent messages and their original recipients, which is needed for automatic management of recipient suppression lists, which is important for managing the reputation of your mail server. For both messages and webhooks, this can be useful for debugging. Use values like "3d" for 3 days, or units "s" for second, "m" for minute, "h" for hour, "w" for week.')), dom.form(async function submit(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
e.stopPropagation();
|
||||||
|
await check(keepRetiredPeriodsFieldset, (async () => await client.KeepRetiredPeriodsSave(parseDuration(keepRetiredMessagePeriod.value), parseDuration(keepRetiredWebhookPeriod.value)))());
|
||||||
|
}, keepRetiredPeriodsFieldset = dom.fieldset(dom.div(style({ display: 'flex', gap: '1em', alignItems: 'flex-end' }), dom.div(dom.label('Messages deliveries', dom.br(), keepRetiredMessagePeriod = dom.input(attr.value(formatDuration(acc.KeepRetiredMessagePeriod))))), dom.div(dom.label('Webhook deliveries', dom.br(), keepRetiredWebhookPeriod = dom.input(attr.value(formatDuration(acc.KeepRetiredWebhookPeriod))))), dom.div(dom.submitbutton('Save'))))), dom.br(), dom.h2('Unique SMTP MAIL FROM login addresses', attr.title('Outgoing messages are normally sent using your email address in the SMTP MAIL FROM command. By using unique addresses (by using the localpart catchall separator, e.g. addresses of the form "localpart+<uniquefromid>@domain"), future incoming DSNs can be related to the original outgoing messages and recipients, which allows for automatic management of recipient suppression lists when keeping retired messages for as long as you expect DSNs to come in as configured above. Configure the addresses used for logging in with SMTP submission, the webapi or webmail for which unique SMTP MAIL FROM addesses should be enabled. Note: These are addresses used for authenticating, not the address in the message "From" header.')), (() => {
|
||||||
|
let inputs = [];
|
||||||
|
let elem;
|
||||||
|
const render = () => {
|
||||||
|
inputs = [];
|
||||||
|
const e = dom.form(async function submit(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
e.stopPropagation();
|
||||||
|
await check(fromIDLoginAddressesFieldset, client.FromIDLoginAddressesSave(inputs.map(e => e.value)));
|
||||||
|
}, fromIDLoginAddressesFieldset = dom.fieldset(dom.table(dom.tbody((acc.FromIDLoginAddresses || []).length === 0 ? dom.tr(dom.td('(None)'), dom.td()) : [], (acc.FromIDLoginAddresses || []).map((s, index) => {
|
||||||
|
const input = dom.input(attr.required(''), attr.value(s));
|
||||||
|
inputs.push(input);
|
||||||
|
const x = dom.tr(dom.td(input), dom.td(dom.clickbutton('Remove', function click() {
|
||||||
|
acc.FromIDLoginAddresses.splice(index, 1);
|
||||||
|
render();
|
||||||
|
})));
|
||||||
|
return x;
|
||||||
|
})), dom.tfoot(dom.tr(dom.td(), dom.td(dom.clickbutton('Add', function click() {
|
||||||
|
acc.FromIDLoginAddresses = (acc.FromIDLoginAddresses || []).concat(['']);
|
||||||
|
render();
|
||||||
|
}))), dom.tr(dom.td(attr.colspan('2'), dom.submitbutton('Save')))))));
|
||||||
|
if (elem) {
|
||||||
|
elem.replaceWith(e);
|
||||||
|
elem = e;
|
||||||
|
}
|
||||||
|
return e;
|
||||||
|
};
|
||||||
|
elem = render();
|
||||||
|
return elem;
|
||||||
|
})(), dom.br(), dom.h2('Suppression list'), dom.p('Messages queued for delivery to recipients on the suppression list will immediately fail. If delivery to a recipient fails repeatedly, it can be added to the suppression list automatically. Repeated rejected delivery attempts can have a negative influence of mail server reputation. Applications sending email can implement their own handling of delivery failure notifications, but not all do.'), dom.form(attr.id('suppressionAdd'), async function submit(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
e.stopPropagation();
|
||||||
|
await check(e.target, client.SuppressionAdd(suppressionAddress.value, true, suppressionReason.value));
|
||||||
|
window.location.reload(); // todo: reload less
|
||||||
|
}), dom.table(dom.thead(dom.tr(dom.th('Address', attr.title('Address that caused this entry to be added to the list. The title (shown on hover) displays an address with a fictional simplified localpart, with lower-cased, dots removed, only first part before "+" or "-" (typicaly catchall separators). When checking if an address is on the suppression list, it is checked against this address.')), dom.th('Manual', attr.title('Whether suppression was added manually, instead of automatically based on bounces.')), dom.th('Reason'), dom.th('Since'), dom.th('Action'))), dom.tbody((suppressions || []).length === 0 ? dom.tr(dom.td(attr.colspan('5'), '(None)')) : [], (suppressions || []).map(s => dom.tr(dom.td(s.OriginalAddress, attr.title(s.BaseAddress)), dom.td(s.Manual ? '✓' : ''), dom.td(s.Reason), dom.td(age(s.Created)), dom.td(dom.clickbutton('Remove', async function click(e) {
|
||||||
|
await check(e.target, client.SuppressionRemove(s.OriginalAddress));
|
||||||
|
window.location.reload(); // todo: reload less
|
||||||
|
}))))), dom.tfoot(dom.tr(dom.td(suppressionAddress = dom.input(attr.type('required'), attr.form('suppressionAdd'))), dom.td(), dom.td(suppressionReason = dom.input(style({ width: '100%' }), attr.form('suppressionAdd'))), dom.td(), dom.td(dom.submitbutton('Add suppression', attr.form('suppressionAdd')))))), dom.br(), dom.h2('Export'), dom.p('Export all messages in all mailboxes. In maildir or mbox format, as .zip or .tgz file.'), dom.table(dom._class('slim'), dom.tr(dom.td('Maildirs in .tgz'), dom.td(exportForm('mail-export-maildir.tgz'))), dom.tr(dom.td('Maildirs in .zip'), dom.td(exportForm('mail-export-maildir.zip'))), dom.tr(dom.td('Mbox files in .tgz'), dom.td(exportForm('mail-export-mbox.tgz'))), dom.tr(dom.td('Mbox files in .zip'), dom.td(exportForm('mail-export-mbox.zip')))), dom.br(), dom.h2('Import'), dom.p('Import messages from a .zip or .tgz file with maildirs and/or mbox files.'), importForm = dom.form(async function submit(e) {
|
||||||
e.preventDefault();
|
e.preventDefault();
|
||||||
e.stopPropagation();
|
e.stopPropagation();
|
||||||
const request = async () => {
|
const request = async () => {
|
||||||
|
@@ -1054,7 +1506,7 @@ const index = async () => {
 })), mailboxFileHint = dom.p(style({ display: 'none', fontStyle: 'italic', marginTop: '.5ex' }), 'This file must either be a zip file or a gzipped tar file with mbox and/or maildir mailboxes. For maildirs, an optional file "dovecot-keywords" is read additional keywords, like Forwarded/Junk/NotJunk. If an imported mailbox already exists by name, messages are added to the existing mailbox. If a mailbox does not yet exist it will be created.')), dom.div(style({ marginBottom: '1ex' }), dom.label(dom.div(style({ marginBottom: '.5ex' }), 'Skip mailbox prefix (optional)'), dom.input(attr.name('skipMailboxPrefix'), function focus() {
 mailboxPrefixHint.style.display = '';
 })), mailboxPrefixHint = dom.p(style({ display: 'none', fontStyle: 'italic', marginTop: '.5ex' }), 'If set, any mbox/maildir path with this prefix will have it stripped before importing. For example, if all mailboxes are in a directory "Takeout", specify that path in the field above so mailboxes like "Takeout/Inbox.mbox" are imported into a mailbox called "Inbox" instead of "Takeout/Inbox".')), dom.div(dom.submitbutton('Upload and import'), dom.p(style({ fontStyle: 'italic', marginTop: '.5ex' }), 'The file is uploaded first, then its messages are imported, finally messages are matched for threading. Importing is done in a transaction, you can abort the entire import before it is finished.')))), importAbortBox = dom.div(), // Outside fieldset because it gets disabled, above progress because may be scrolling it down quickly with problems.
-importProgress = dom.div(style({ display: 'none' })), footer);
+importProgress = dom.div(style({ display: 'none' })), dom.br(), footer);
 // Try to show the progress of an earlier import session. The user may have just
 // refreshed the browser.
 let importToken;

@@ -84,6 +84,48 @@ const login = async (reason: string) => {
 })
 }
+
+// Popup shows kids in a centered div with white background on top of a
+// transparent overlay on top of the window. Clicking the overlay or hitting
+// Escape closes the popup. Scrollbars are automatically added to the div with
+// kids. Returns a function that removes the popup.
+const popup = (...kids: ElemArg[]) => {
+const origFocus = document.activeElement
+const close = () => {
+if (!root.parentNode) {
+return
+}
+root.remove()
+if (origFocus && origFocus instanceof HTMLElement && origFocus.parentNode) {
+origFocus.focus()
+}
+}
+let content: HTMLElement
+const root = dom.div(
+style({position: 'fixed', top: 0, right: 0, bottom: 0, left: 0, backgroundColor: 'rgba(0, 0, 0, 0.1)', display: 'flex', alignItems: 'center', justifyContent: 'center', zIndex: '1'}),
+function keydown(e: KeyboardEvent) {
+if (e.key === 'Escape') {
+e.stopPropagation()
+close()
+}
+},
+function click(e: MouseEvent) {
+e.stopPropagation()
+close()
+},
+content=dom.div(
+attr.tabindex('0'),
+style({backgroundColor: 'white', borderRadius: '.25em', padding: '1em', boxShadow: '0 0 20px rgba(0, 0, 0, 0.1)', border: '1px solid #ddd', maxWidth: '95vw', overflowX: 'auto', maxHeight: '95vh', overflowY: 'auto'}),
+function click(e: MouseEvent) {
+e.stopPropagation()
+},
+kids,
+)
+)
+document.body.appendChild(root)
+content.focus()
+return close
+}

 const localStorageGet = (k: string): string | null => {
 try {
 return window.localStorage.getItem(k)
@@ -195,6 +237,42 @@ const yellow = '#ffe400'
 const red = '#ff7443'
 const blue = '#8bc8ff'
+
+const age = (date: Date) => {
+const r = dom.span(dom._class('notooltip'), attr.title(date.toString()))
+const nowSecs = new Date().getTime()/1000
+let t = nowSecs - date.getTime()/1000
+let negative = ''
+if (t < 0) {
+negative = '-'
+t = -t
+}
+const minute = 60
+const hour = 60*minute
+const day = 24*hour
+const month = 30*day
+const year = 365*day
+const periods = [year, month, day, hour, minute]
+const suffix = ['y', 'mo', 'd', 'h', 'min']
+let s
+for (let i = 0; i < periods.length; i++) {
+const p = periods[i]
+if (t >= 2*p || i === periods.length-1) {
+const n = Math.round(t/p)
+s = '' + n + suffix[i]
+break
+}
+}
+if (t < 60) {
+s = '<1min'
+// Prevent showing '-<1min' when browser and server have relatively small time drift of max 1 minute.
+negative = ''
+}
+
+dom._kids(r, negative+s)
+return r
+}
+

 const formatQuotaSize = (v: number) => {
 if (v === 0) {
 return '0'
|
||||||
}
|
}
|
||||||
|
|
||||||
const index = async () => {
|
const index = async () => {
|
||||||
const [acc, storageUsed, storageLimit] = await client.Account()
|
const [acc, storageUsed, storageLimit, suppressions] = await client.Account()
|
||||||
|
|
||||||
let fullNameForm: HTMLFormElement
|
let fullNameForm: HTMLFormElement
|
||||||
let fullNameFieldset: HTMLFieldSetElement
|
let fullNameFieldset: HTMLFieldSetElement
|
||||||
|
@@ -224,6 +302,55 @@ const index = async () => {
let password2: HTMLInputElement
let passwordHint: HTMLElement

let outgoingWebhookFieldset: HTMLFieldSetElement
let outgoingWebhookURL: HTMLInputElement
let outgoingWebhookAuthorization: HTMLInputElement
let outgoingWebhookEvents: HTMLSelectElement

let incomingWebhookFieldset: HTMLFieldSetElement
let incomingWebhookURL: HTMLInputElement
let incomingWebhookAuthorization: HTMLInputElement

let keepRetiredPeriodsFieldset: HTMLFieldSetElement
let keepRetiredMessagePeriod: HTMLInputElement
let keepRetiredWebhookPeriod: HTMLInputElement

let fromIDLoginAddressesFieldset: HTMLFieldSetElement

const second = 1000*1000*1000
const minute = 60*second
const hour = 60*minute
const day = 24*hour
const week = 7*day
const parseDuration = (s: string) => {
if (!s) { return 0 }
const xparseint = () => {
const v = parseInt(s.substring(0, s.length-1))
if (isNaN(v) || Math.round(v) !== v) {
throw new Error('bad number in duration')
}
return v
}
if (s.endsWith('w')) { return xparseint()*week }
if (s.endsWith('d')) { return xparseint()*day }
if (s.endsWith('h')) { return xparseint()*hour }
if (s.endsWith('m')) { return xparseint()*minute }
if (s.endsWith('s')) { return xparseint()*second }
throw new Error('bad duration '+s)
}
const formatDuration = (v: number) => {
if (v === 0) {
return ''
}
const is = (period: number) => v > 0 && Math.round(v/period) === v/period
const format = (period: number, s: string) => ''+(v/period)+s
if (is(week)) { return format(week, 'w') }
if (is(day)) { return format(day, 'd') }
if (is(hour)) { return format(hour, 'h') }
if (is(minute)) { return format(minute, 'm') }
return format(second, 's')
}

let importForm: HTMLFormElement
let importFieldset: HTMLFieldSetElement
let mailboxFileHint: HTMLElement
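A quick sketch (illustration only, not part of the change) of how the duration helpers above round-trip the values used for the keep-retired settings; the nanosecond-based units mirror Go's time.Duration:

// parseDuration('3d') === 3 * 24 * 60 * 60 * 1000*1000*1000
// formatDuration(parseDuration('3d')) === '3d'
// formatDuration(90 * minute) === '90m'  // not a whole number of hours, so it falls through to minutes
// parseDuration('') === 0 and formatDuration(0) === '' represent "no period configured"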
@@ -231,6 +358,9 @@ const index = async () => {
let importProgress: HTMLElement
let importAbortBox: HTMLElement

let suppressionAddress: HTMLInputElement
let suppressionReason: HTMLInputElement

const importTrack = async (token: string) => {
const importConnection = dom.div('Waiting for updates...')
importProgress.appendChild(importConnection)
@@ -345,6 +475,252 @@ const index = async () => {
)
}

const authorizationPopup = (dest: HTMLInputElement) => {
let username: HTMLInputElement
let password: HTMLInputElement
const close = popup(
dom.form(
function submit(e: SubmitEvent) {
e.preventDefault()
e.stopPropagation()
dest.value = 'Basic '+window.btoa(username.value+':'+password.value)
close()
},
dom.p('Compose HTTP Basic authentication header'),
dom.div(
style({marginBottom: '1ex'}),
dom.div(dom.label('Username')),
username=dom.input(attr.required('')),
),
dom.div(
style({marginBottom: '1ex'}),
dom.div(dom.label('Password (shown in clear)')),
password=dom.input(attr.required('')),
),
dom.div(
style({marginBottom: '1ex'}),
dom.submitbutton('Set'),
),
dom.div('A HTTP Basic authorization header contains the password in plain text, as base64.'),
),
)
username.focus()
}

const popupTestOutgoing = () => {
let fieldset: HTMLFieldSetElement
let event: HTMLSelectElement
let dsn: HTMLInputElement
let suppressing: HTMLInputElement
let queueMsgID: HTMLInputElement
let fromID: HTMLInputElement
let messageID: HTMLInputElement
let error: HTMLInputElement
let extra: HTMLInputElement
let body: HTMLTextAreaElement
let curl: HTMLElement
let result: HTMLElement

let data: api.Outgoing = {
Version: 0,
Event: api.OutgoingEvent.EventDelivered,
DSN: false,
Suppressing: false,
QueueMsgID: 123,
FromID: 'MDEyMzQ1Njc4OWFiY2RlZg',
MessageID: '<QnxzgulZK51utga6agH_rg@mox.example>',
Subject: 'test from mox web pages',
WebhookQueued: new Date(),
SMTPCode: 0,
SMTPEnhancedCode: '',
Error: '',
Extra: {},
}
const onchange = function change() {
data = {
Version: 0,
Event: event.value as api.OutgoingEvent,
DSN: dsn.checked,
Suppressing: suppressing.checked,
QueueMsgID: parseInt(queueMsgID.value),
FromID: fromID.value,
MessageID: messageID.value,
Subject: 'test from mox web pages',
WebhookQueued: new Date(),
SMTPCode: 0,
SMTPEnhancedCode: '',
Error: error.value,
Extra: JSON.parse(extra.value),
}
const curlStr = "curl " + (outgoingWebhookAuthorization.value ? "-H 'Authorization: "+outgoingWebhookAuthorization.value+"' " : "") + "-H 'X-Mox-Webhook-ID: 1' -H 'X-Mox-Webhook-Attempt: 1' --json '"+JSON.stringify(data)+"' '"+outgoingWebhookURL.value+"'"
dom._kids(curl, style({maxWidth: '45em', wordBreak: 'break-all'}), curlStr)
body.value = JSON.stringify(data, undefined, "\t")
}

popup(
dom.h1('Test webhook for outgoing delivery'),
dom.form(
async function submit(e: SubmitEvent) {
e.preventDefault()
e.stopPropagation()
result.classList.add('loadstart')
const [code, response, errmsg] = await check(fieldset, client.OutgoingWebhookTest(outgoingWebhookURL.value, outgoingWebhookAuthorization.value, data))
const nresult = dom.div(
dom._class('loadend'),
dom.table(
dom.tr(dom.td('HTTP status code'), dom.td(''+code)),
dom.tr(dom.td('Error message'), dom.td(errmsg)),
dom.tr(dom.td('Response'), dom.td(response)),
),
)
result.replaceWith(nresult)
result = nresult
},
fieldset=dom.fieldset(
dom.p('Make a test call to ', dom.b(outgoingWebhookURL.value), '.'),
dom.div(style({display: 'flex', gap: '1em'}),
dom.div(
dom.h2('Parameters'),
dom.div(
style({marginBottom: '.5ex'}),
dom.label(
'Event',
dom.div(
event=dom.select(onchange,
["delivered", "suppressed", "delayed", "failed", "relayed", "expanded", "canceled", "unrecognized"].map(s => dom.option(s.substring(0, 1).toUpperCase()+s.substring(1), attr.value(s))),
),
),
),
),
dom.div(style({marginBottom: '.5ex'}), dom.label(dsn=dom.input(attr.type('checkbox')), ' DSN', onchange)),
dom.div(style({marginBottom: '.5ex'}), dom.label(suppressing=dom.input(attr.type('checkbox')), ' Suppressing', onchange)),
dom.div(style({marginBottom: '.5ex'}), dom.label('Queue message ID ', dom.div(queueMsgID=dom.input(attr.required(''), attr.type('number'), attr.value('123'), onchange)))),
dom.div(style({marginBottom: '.5ex'}), dom.label('From ID ', dom.div(fromID=dom.input(attr.required(''), attr.value(data.FromID), onchange)))),
dom.div(style({marginBottom: '.5ex'}), dom.label('MessageID', dom.div(messageID=dom.input(attr.required(''), attr.value(data.MessageID), onchange)))),
dom.div(style({marginBottom: '.5ex'}), dom.label('Error', dom.div(error=dom.input(onchange)))),
dom.div(style({marginBottom: '.5ex'}), dom.label('Extra', dom.div(extra=dom.input(attr.required(''), attr.value('{}'), onchange)))),
),
dom.div(
dom.h2('Headers'),
dom.pre('X-Mox-Webhook-ID: 1\nX-Mox-Webhook-Attempt: 1'),
dom.br(),
dom.h2('JSON'),
body=dom.textarea(attr.disabled(''), attr.rows('15'), style({width: '30em'})),
dom.br(),
dom.h2('curl'),
curl=dom.div(dom._class('literal')),
),
),
dom.br(),
dom.div(style({textAlign: 'right'}), dom.submitbutton('Post')),
dom.br(),
result=dom.div(),
),
),
)

onchange()
}

const popupTestIncoming = () => {
let fieldset: HTMLFieldSetElement
let body: HTMLTextAreaElement
let curl: HTMLElement
let result: HTMLElement

let data: api.Incoming = {
Version: 0,
From: [{Name: 'remote', Address: 'remote@remote.example'}],
To: [{Name: 'mox', Address: 'mox@mox.example'}],
CC: [],
BCC: [],
ReplyTo: [],
Subject: 'test webhook for incoming message',
MessageID: '<QnxzgulZK51utga6agH_rg@mox.example>',
InReplyTo: '',
References: [],
Date: new Date(),
Text: 'hi ☺\n',
HTML: '',
Structure: {
ContentType: 'text/plain',
ContentTypeParams: {charset: 'utf-8'},
ContentID: '',
DecodedSize: 8,
Parts: [],
},
Meta: {
MsgID: 1,
MailFrom: 'remote@remote.example',
MailFromValidated: true,
MsgFromValidated: true,
RcptTo: 'mox@localhost',
DKIMVerifiedDomains: ['remote.example'],
RemoteIP: '127.0.0.1',
Received: new Date(),
MailboxName: 'Inbox',
Automated: false,
},
}

const onchange = function change() {
try {
api.parser.Incoming(JSON.parse(body.value))
} catch (err) {
console.log({err})
window.alert('Error parsing data: '+errmsg(err))
}
const curlStr = "curl " + (incomingWebhookAuthorization.value ? "-H 'Authorization: "+incomingWebhookAuthorization.value+"' " : "") + "-H 'X-Mox-Webhook-ID: 1' -H 'X-Mox-Webhook-Attempt: 1' --json '"+JSON.stringify(data)+"' '"+incomingWebhookURL.value+"'"
dom._kids(curl, style({maxWidth: '45em', wordBreak: 'break-all'}), curlStr)
}

popup(
dom.h1('Test webhook for incoming delivery'),
dom.form(
async function submit(e: SubmitEvent) {
e.preventDefault()
e.stopPropagation()
result.classList.add('loadstart')
const [code, response, errmsg] = await check(fieldset, (async () => await client.IncomingWebhookTest(incomingWebhookURL.value, incomingWebhookAuthorization.value, api.parser.Incoming(JSON.parse(body.value))))())
const nresult = dom.div(
dom._class('loadend'),
dom.table(
dom.tr(dom.td('HTTP status code'), dom.td(''+code)),
dom.tr(dom.td('Error message'), dom.td(errmsg)),
dom.tr(dom.td('Response'), dom.td(response)),
),
)
result.replaceWith(nresult)
result = nresult
},
fieldset=dom.fieldset(
dom.p('Make a test call to ', dom.b(incomingWebhookURL.value), '.'),
dom.div(style({display: 'flex', gap: '1em'}),
dom.div(
dom.h2('JSON'),
body=dom.textarea(style({maxHeight: '90vh'}), style({width: '30em'}), onchange),
),
dom.div(
dom.h2('Headers'),
dom.pre('X-Mox-Webhook-ID: 1\nX-Mox-Webhook-Attempt: 1'),
dom.br(),

dom.h2('curl'),
curl=dom.div(dom._class('literal')),
),
),
dom.br(),
dom.div(style({textAlign: 'right'}), dom.submitbutton('Post')),
dom.br(),
result=dom.div(),
),
),
)
body.value = JSON.stringify(data, undefined, '\t')
body.setAttribute('rows', ''+Math.min(40, (body.value.split('\n').length+1)))
onchange()
}

dom._kids(page,
crumbs('Mox Account'),
dom.p('NOTE: Not all account settings can be configured through these pages yet. See the configuration file for more options.'),
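To exercise the test popups above end-to-end, something has to listen at the configured webhook URL. A minimal sketch of such a receiver in TypeScript for Node (illustration only; the port, the expected Authorization value and the logging are assumptions, not part of mox):

import * as http from 'node:http'

const expectedAuth = 'Basic dXNlcjpwYXNzd29yZA==' // whatever was configured as Authorization header value, if any

http.createServer((req, res) => {
	if (expectedAuth && req.headers['authorization'] !== expectedAuth) {
		res.writeHead(401)
		res.end('unauthorized')
		return
	}
	let buf = ''
	req.on('data', (chunk) => { buf += chunk })
	req.on('end', () => {
		const hook = JSON.parse(buf)
		// X-Mox-Webhook-ID and X-Mox-Webhook-Attempt identify the webhook and its delivery attempt.
		console.log('webhook', req.headers['x-mox-webhook-id'], 'attempt', req.headers['x-mox-webhook-attempt'], hook.Event || 'incoming', hook.Subject)
		res.writeHead(200)
		res.end('ok')
	})
}).listen(8000)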
@@ -386,6 +762,7 @@ const index = async () => {
),
),
dom.br(),

dom.h2('Change password'),
passwordForm=dom.form(
passwordFieldset=dom.fieldset(
@@ -442,6 +819,7 @@ const index = async () => {
},
),
dom.br(),

dom.h2('Disk usage'),
dom.p('Storage used is ', dom.b(formatQuotaSize(Math.floor(storageUsed/(1024*1024))*1024*1024)),
storageLimit > 0 ? [
@@ -450,6 +828,256 @@ const index = async () => {
''+Math.floor(100*storageUsed/storageLimit),
'%).',
] : [', no explicit limit is configured.']),

dom.h2('Webhooks'),
dom.h3('Outgoing', attr.title('Webhooks for outgoing messages are called for each attempt to deliver a message in the outgoing queue, e.g. when the queue has delivered a message to the next hop, when a single attempt failed with a temporary error, when delivery permanently failed, or when DSN (delivery status notification) messages were received about a previously sent message.')),
dom.form(
async function submit(e: SubmitEvent) {
e.preventDefault()
e.stopPropagation()

await check(outgoingWebhookFieldset, client.OutgoingWebhookSave(outgoingWebhookURL.value, outgoingWebhookAuthorization.value, [...outgoingWebhookEvents.selectedOptions].map(o => o.value)))
},
outgoingWebhookFieldset=dom.fieldset(
dom.div(style({display: 'flex', gap: '1em'}),
dom.div(
dom.label(
dom.div('URL', attr.title('URL to do an HTTP POST to for each event. Webhooks are disabled if empty.')),
outgoingWebhookURL=dom.input(attr.value(acc.OutgoingWebhook?.URL || ''), style({width: '30em'})),
),
),
dom.div(
dom.label(
dom.div(
'Authorization header ',
dom.a(
'Basic',
attr.href(''),
function click(e: MouseEvent) {
e.preventDefault()
authorizationPopup(outgoingWebhookAuthorization)
},
),
attr.title('If non-empty, HTTP requests have this value as Authorization header, e.g. Basic <base64-encoded-username-password>.'),
),
outgoingWebhookAuthorization=dom.input(attr.value(acc.OutgoingWebhook?.Authorization || '')),
),
),
dom.div(
dom.label(
style({verticalAlign: 'top'}),
dom.div('Events', attr.title('Either limit to specific events, or receive all events (default).')),
outgoingWebhookEvents=dom.select(
style({verticalAlign: 'bottom'}),
attr.multiple(''),
attr.size('8'), // Number of options.
["delivered", "suppressed", "delayed", "failed", "relayed", "expanded", "canceled", "unrecognized"].map(s => dom.option(s.substring(0, 1).toUpperCase()+s.substring(1), attr.value(s), acc.OutgoingWebhook?.Events?.includes(s) ? attr.selected('') : [])),
),
),
),
dom.div(
dom.div(dom.label('\u00a0')),
dom.submitbutton('Save'), ' ',
dom.clickbutton('Test', function click() {
popupTestOutgoing()
}),
),
),
),
),
dom.br(),
dom.h3('Incoming', attr.title('Webhooks for incoming messages are called for each message received over SMTP, excluding DSN messages about previous deliveries.')),
dom.form(
async function submit(e: SubmitEvent) {
e.preventDefault()
e.stopPropagation()

await check(incomingWebhookFieldset, client.IncomingWebhookSave(incomingWebhookURL.value, incomingWebhookAuthorization.value))
},
incomingWebhookFieldset=dom.fieldset(
dom.div(
style({display: 'flex', gap: '1em'}),
dom.div(
dom.label(
dom.div('URL'),
incomingWebhookURL=dom.input(attr.value(acc.IncomingWebhook?.URL || ''), style({width: '30em'})),
),
),
dom.div(
dom.label(
dom.div(
'Authorization header ',
dom.a(
'Basic',
attr.href(''),
function click(e: MouseEvent) {
e.preventDefault()
authorizationPopup(incomingWebhookAuthorization)
},
),
attr.title('If non-empty, HTTP requests have this value as Authorization header, e.g. Basic <base64-encoded-username-password>.'),
),
incomingWebhookAuthorization=dom.input(attr.value(acc.IncomingWebhook?.Authorization || '')),
),
),
dom.div(
dom.div(dom.label('\u00a0')),
dom.submitbutton('Save'), ' ',
dom.clickbutton('Test', function click() {
popupTestIncoming()
}),
),
),
),
),
dom.br(),

dom.h2('Keep messages/webhooks retired from queue', attr.title('After delivering a message or webhook from the queue it is removed by default. But you can also keep these "retired" messages/webhooks around for a while. With unique SMTP MAIL FROM addresses configured below, this allows relating incoming delivery status notification messages (DSNs) to previously sent messages and their original recipients, which is needed for automatic management of recipient suppression lists, which is important for managing the reputation of your mail server. For both messages and webhooks, this can be useful for debugging. Use values like "3d" for 3 days, or units "s" for second, "m" for minute, "h" for hour, "w" for week.')),
dom.form(
async function submit(e: SubmitEvent) {
e.preventDefault()
e.stopPropagation()

await check(keepRetiredPeriodsFieldset, (async () => await client.KeepRetiredPeriodsSave(parseDuration(keepRetiredMessagePeriod.value), parseDuration(keepRetiredWebhookPeriod.value)))())
},
keepRetiredPeriodsFieldset=dom.fieldset(
dom.div(
style({display: 'flex', gap: '1em', alignItems: 'flex-end'}),
dom.div(
dom.label(
'Message deliveries',
dom.br(),
keepRetiredMessagePeriod=dom.input(attr.value(formatDuration(acc.KeepRetiredMessagePeriod))),
),
),
dom.div(
dom.label(
'Webhook deliveries',
dom.br(),
keepRetiredWebhookPeriod=dom.input(attr.value(formatDuration(acc.KeepRetiredWebhookPeriod))),
),
),
dom.div(
dom.submitbutton('Save'),
),
),
),
),
dom.br(),

dom.h2('Unique SMTP MAIL FROM login addresses', attr.title('Outgoing messages are normally sent using your email address in the SMTP MAIL FROM command. By using unique addresses (by using the localpart catchall separator, e.g. addresses of the form "localpart+<uniquefromid>@domain"), future incoming DSNs can be related to the original outgoing messages and recipients, which allows for automatic management of recipient suppression lists when keeping retired messages for as long as you expect DSNs to come in as configured above. Configure the addresses used for logging in with SMTP submission, the webapi or webmail for which unique SMTP MAIL FROM addresses should be enabled. Note: These are addresses used for authenticating, not the address in the message "From" header.')),
(() => {
let inputs: HTMLInputElement[] = []
let elem: HTMLElement

const render = () => {
inputs = []

const e = dom.form(
async function submit(e: SubmitEvent) {
e.preventDefault()
e.stopPropagation()

await check(fromIDLoginAddressesFieldset, client.FromIDLoginAddressesSave(inputs.map(e => e.value)))
},
fromIDLoginAddressesFieldset=dom.fieldset(
dom.table(
dom.tbody(
(acc.FromIDLoginAddresses || []).length === 0 ? dom.tr(dom.td('(None)'), dom.td()) : [],
(acc.FromIDLoginAddresses || []).map((s, index) => {
const input = dom.input(attr.required(''), attr.value(s))
inputs.push(input)
const x = dom.tr(
dom.td(input),
dom.td(
dom.clickbutton('Remove', function click() {
acc.FromIDLoginAddresses!.splice(index, 1)
render()
}),
),
)
return x
}),
),
dom.tfoot(
dom.tr(
dom.td(),
dom.td(
dom.clickbutton('Add', function click() {
acc.FromIDLoginAddresses = (acc.FromIDLoginAddresses || []).concat([''])
render()
}),
),
),
dom.tr(
dom.td(attr.colspan('2'), dom.submitbutton('Save')),
),
),
),
),
)
if (elem) {
elem.replaceWith(e)
elem = e
}
return e
}
elem = render()
return elem
})(),
dom.br(),

dom.h2('Suppression list'),
dom.p('Messages queued for delivery to recipients on the suppression list will immediately fail. If delivery to a recipient fails repeatedly, it can be added to the suppression list automatically. Repeated rejected delivery attempts can have a negative influence on mail server reputation. Applications sending email can implement their own handling of delivery failure notifications, but not all do.'),
dom.form(
attr.id('suppressionAdd'),
async function submit(e: SubmitEvent) {
e.preventDefault()
e.stopPropagation()

await check(e.target! as HTMLButtonElement, client.SuppressionAdd(suppressionAddress.value, true, suppressionReason.value))
window.location.reload() // todo: reload less
},
),
dom.table(
dom.thead(
dom.tr(
dom.th('Address', attr.title('Address that caused this entry to be added to the list. The title (shown on hover) displays an address with a fictional simplified localpart, with lower-cased, dots removed, only first part before "+" or "-" (typically catchall separators). When checking if an address is on the suppression list, it is checked against this address.')),
dom.th('Manual', attr.title('Whether suppression was added manually, instead of automatically based on bounces.')),
dom.th('Reason'),
dom.th('Since'),
dom.th('Action'),
),
),
dom.tbody(
(suppressions || []).length === 0 ? dom.tr(dom.td(attr.colspan('5'), '(None)')) : [],
(suppressions || []).map(s =>
dom.tr(
dom.td(s.OriginalAddress, attr.title(s.BaseAddress)),
dom.td(s.Manual ? '✓' : ''),
dom.td(s.Reason),
dom.td(age(s.Created)),
dom.td(
dom.clickbutton('Remove', async function click(e: MouseEvent) {
await check(e.target! as HTMLButtonElement, client.SuppressionRemove(s.OriginalAddress))
window.location.reload() // todo: reload less
})
),
),
),
),
dom.tfoot(
dom.tr(
dom.td(suppressionAddress=dom.input(attr.type('required'), attr.form('suppressionAdd'))),
dom.td(),
dom.td(suppressionReason=dom.input(style({width: '100%'}), attr.form('suppressionAdd'))),
dom.td(),
dom.td(dom.submitbutton('Add suppression', attr.form('suppressionAdd'))),
),
),
),
dom.br(),

dom.h2('Export'),
dom.p('Export all messages in all mailboxes. In maildir or mbox format, as .zip or .tgz file.'),
dom.table(dom._class('slim'),
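The forms in the hunk above all map to single API calls. A sketch of the equivalent programmatic configuration through the generated account client (all values here are hypothetical, for illustration only):

// Hypothetical values; each call matches one of the forms above.
await client.OutgoingWebhookSave('https://app.example/mox/outgoing', 'Basic dXNlcjpwYXNzd29yZA==', ['delivered', 'failed'])
await client.IncomingWebhookSave('https://app.example/mox/incoming', 'Basic dXNlcjpwYXNzd29yZA==')
await client.KeepRetiredPeriodsSave(parseDuration('7d'), parseDuration('7d'))
await client.FromIDLoginAddressesSave(['app+fromid@mox.example'])
await client.SuppressionAdd('bouncing@remote.example', true, 'added manually as an example')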
@@ -471,6 +1099,7 @@ const index = async () => {
),
),
dom.br(),

dom.h2('Import'),
dom.p('Import messages from a .zip or .tgz file with maildirs and/or mbox files.'),
importForm=dom.form(
@@ -570,6 +1199,8 @@ const index = async () => {
importProgress=dom.div(
style({display: 'none'}),
),
dom.br(),

footer,
)
@@ -744,6 +1375,7 @@ const destination = async (name: string) => {
fullName=dom.input(attr.value(dest.FullName)),
),
dom.br(),

dom.h2('Rulesets'),
dom.p('Incoming messages are checked against the rulesets. If a ruleset matches, the message is delivered to the mailbox configured for the ruleset instead of to the default mailbox.'),
dom.p('"Is Forward" does not affect matching, but prevents the sending mail server from being included in future junk classifications by clearing fields related to the forwarding email server (IP address, EHLO domain, MAIL FROM domain and a matching DKIM domain), and prevents DMARC rejects for forwarded messages.'),
@@ -16,18 +16,22 @@ import (
"os"
"path"
"path/filepath"
"reflect"
"runtime/debug"
"sort"
"strings"
"testing"
"time"

"github.com/mjl-/bstore"
"github.com/mjl-/sherpa"

"github.com/mjl-/mox/mlog"
"github.com/mjl-/mox/mox-"
"github.com/mjl-/mox/queue"
"github.com/mjl-/mox/store"
"github.com/mjl-/mox/webauth"
"github.com/mjl-/mox/webhook"
)

var ctxbg = context.Background()
@@ -73,6 +77,13 @@ func tneedErrorCode(t *testing.T, code string, fn func()) {
fn()
}

func tcompare(t *testing.T, got, expect any) {
t.Helper()
if !reflect.DeepEqual(got, expect) {
t.Fatalf("got:\n%#v\nexpected:\n%#v", got, expect)
}
}

func TestAccount(t *testing.T) {
os.RemoveAll("../testdata/httpaccount/data")
mox.ConfigStaticPath = filepath.FromSlash("../testdata/httpaccount/mox.conf")
@@ -216,7 +227,9 @@ func TestAccount(t *testing.T) {

api.SetPassword(ctx, "test1234")

-account, _, _ := api.Account(ctx)
+err = queue.Init() // For DB.
+tcheck(t, err, "queue init")
+account, _, _, _ := api.Account(ctx)
api.DestinationSave(ctx, "mjl☺@mox.example", account.Destinations["mjl☺@mox.example"], account.Destinations["mjl☺@mox.example"]) // todo: save modified value and compare it afterwards

api.AccountSaveFullName(ctx, account.FullName+" changed") // todo: check if value was changed
@@ -371,6 +384,59 @@ func TestAccount(t *testing.T) {
testExport("/export/mail-export-mbox.tgz", false, 2)
testExport("/export/mail-export-mbox.zip", true, 2)

sl := api.SuppressionList(ctx)
tcompare(t, len(sl), 0)

api.SuppressionAdd(ctx, "mjl@mox.example", true, "testing")
tneedErrorCode(t, "user:error", func() { api.SuppressionAdd(ctx, "mjl@mox.example", true, "testing") }) // Duplicate.
tneedErrorCode(t, "user:error", func() { api.SuppressionAdd(ctx, "bogus", true, "testing") }) // Bad address.

sl = api.SuppressionList(ctx)
tcompare(t, len(sl), 1)

api.SuppressionRemove(ctx, "mjl@mox.example")
tneedErrorCode(t, "user:error", func() { api.SuppressionRemove(ctx, "mjl@mox.example") }) // Absent.
tneedErrorCode(t, "user:error", func() { api.SuppressionRemove(ctx, "bogus") }) // Not an address.

var hooks int
hookServer := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
fmt.Fprintln(w, "ok")
hooks++
}))
defer hookServer.Close()

api.OutgoingWebhookSave(ctx, "http://localhost:1234", "Basic base64", []string{"delivered"})
api.OutgoingWebhookSave(ctx, "http://localhost:1234", "Basic base64", []string{})
tneedErrorCode(t, "user:error", func() {
api.OutgoingWebhookSave(ctx, "http://localhost:1234/outgoing", "Basic base64", []string{"bogus"})
})
tneedErrorCode(t, "user:error", func() { api.OutgoingWebhookSave(ctx, "invalid", "Basic base64", nil) })
api.OutgoingWebhookSave(ctx, "", "", nil) // Restore.

code, response, errmsg := api.OutgoingWebhookTest(ctx, hookServer.URL, "", webhook.Outgoing{})
tcompare(t, code, 200)
tcompare(t, response, "ok\n")
tcompare(t, errmsg, "")
tneedErrorCode(t, "user:error", func() { api.OutgoingWebhookTest(ctx, "bogus", "", webhook.Outgoing{}) })

api.IncomingWebhookSave(ctx, "http://localhost:1234", "Basic base64")
tneedErrorCode(t, "user:error", func() { api.IncomingWebhookSave(ctx, "invalid", "Basic base64") })
api.IncomingWebhookSave(ctx, "", "") // Restore.

code, response, errmsg = api.IncomingWebhookTest(ctx, hookServer.URL, "", webhook.Incoming{})
tcompare(t, code, 200)
tcompare(t, response, "ok\n")
tcompare(t, errmsg, "")
tneedErrorCode(t, "user:error", func() { api.IncomingWebhookTest(ctx, "bogus", "", webhook.Incoming{}) })

api.FromIDLoginAddressesSave(ctx, []string{"mjl☺@mox.example"})
api.FromIDLoginAddressesSave(ctx, []string{"mjl☺@mox.example", "mjl☺+fromid@mox.example"})
api.FromIDLoginAddressesSave(ctx, []string{})
tneedErrorCode(t, "user:error", func() { api.FromIDLoginAddressesSave(ctx, []string{"bogus@other.example"}) })

api.KeepRetiredPeriodsSave(ctx, time.Minute, time.Minute)
api.KeepRetiredPeriodsSave(ctx, 0, 0) // Restore.

api.Logout(ctx)
tneedErrorCode(t, "server:error", func() { api.Logout(ctx) })
}
@@ -88,12 +88,19 @@
"Typewords": [
"int64"
]
},
{
"Name": "suppressions",
"Typewords": [
"[]",
"Suppression"
]
}
]
},
{
"Name": "AccountSaveFullName",
-"Docs": "",
+"Docs": "AccountSaveFullName saves the full name (used as display name in email messages)\nfor the account.",
"Params": [
{
"Name": "fullName",
@@ -154,6 +161,231 @@
]
}
]
},
{
"Name": "SuppressionList",
"Docs": "SuppressionList lists the addresses on the suppression list of this account.",
"Params": [],
"Returns": [
{
"Name": "suppressions",
"Typewords": [
"[]",
"Suppression"
]
}
]
},
{
"Name": "SuppressionAdd",
"Docs": "SuppressionAdd adds an email address to the suppression list.",
"Params": [
{
"Name": "address",
"Typewords": [
"string"
]
},
{
"Name": "manual",
"Typewords": [
"bool"
]
},
{
"Name": "reason",
"Typewords": [
"string"
]
}
],
"Returns": [
{
"Name": "suppression",
"Typewords": [
"Suppression"
]
}
]
},
{
"Name": "SuppressionRemove",
"Docs": "SuppressionRemove removes the email address from the suppression list.",
"Params": [
{
"Name": "address",
"Typewords": [
"string"
]
}
],
"Returns": []
},
{
"Name": "OutgoingWebhookSave",
"Docs": "OutgoingWebhookSave saves a new webhook url for outgoing deliveries. If url\nis empty, the webhook is disabled. If authorization is non-empty it is used for\nthe Authorization header in HTTP requests. Events specifies the outgoing events\nto be delivered, or all if empty/nil.",
"Params": [
{
"Name": "url",
"Typewords": [
"string"
]
},
{
"Name": "authorization",
"Typewords": [
"string"
]
},
{
"Name": "events",
"Typewords": [
"[]",
"string"
]
}
],
"Returns": []
},
{
"Name": "OutgoingWebhookTest",
"Docs": "OutgoingWebhookTest makes a test webhook call to urlStr, with optional\nauthorization. If the HTTP request is made this call will succeed also for\nnon-2xx HTTP status codes.",
"Params": [
{
"Name": "urlStr",
"Typewords": [
"string"
]
},
{
"Name": "authorization",
"Typewords": [
"string"
]
},
{
"Name": "data",
"Typewords": [
"Outgoing"
]
}
],
"Returns": [
{
"Name": "code",
"Typewords": [
"int32"
]
},
{
"Name": "response",
"Typewords": [
"string"
]
},
{
"Name": "errmsg",
"Typewords": [
"string"
]
}
]
},
{
"Name": "IncomingWebhookSave",
"Docs": "IncomingWebhookSave saves a new webhook url for incoming deliveries. If url is\nempty, the webhook is disabled. If authorization is not empty, it is used in\nthe Authorization header in requests.",
"Params": [
{
"Name": "url",
"Typewords": [
"string"
]
},
{
"Name": "authorization",
"Typewords": [
"string"
]
}
],
"Returns": []
},
{
"Name": "IncomingWebhookTest",
"Docs": "IncomingWebhookTest makes a test webhook HTTP delivery request to urlStr,\nwith optional authorization header. If the HTTP call is made, this function\nreturns non-error regardless of HTTP status code.",
"Params": [
{
"Name": "urlStr",
"Typewords": [
"string"
]
},
{
"Name": "authorization",
"Typewords": [
"string"
]
},
{
"Name": "data",
"Typewords": [
"Incoming"
]
}
],
"Returns": [
{
"Name": "code",
"Typewords": [
"int32"
]
},
{
"Name": "response",
"Typewords": [
"string"
]
},
{
"Name": "errmsg",
"Typewords": [
"string"
]
}
]
},
{
"Name": "FromIDLoginAddressesSave",
"Docs": "FromIDLoginAddressesSave saves new login addresses to enable unique SMTP\nMAIL FROM addresses (\"fromid\") for deliveries from the queue.",
"Params": [
{
"Name": "loginAddresses",
"Typewords": [
"[]",
"string"
]
}
],
"Returns": []
},
{
"Name": "KeepRetiredPeriodsSave",
"Docs": "KeepRetiredPeriodsSave saves the periods to keep retired messages and webhooks.",
"Params": [
{
"Name": "keepRetiredMessagePeriod",
"Typewords": [
"int64"
]
},
{
"Name": "keepRetiredWebhookPeriod",
"Typewords": [
"int64"
]
}
],
"Returns": []
}
],
"Sections": [],
@@ -162,6 +394,44 @@
"Name": "Account",
"Docs": "",
"Fields": [
{
"Name": "OutgoingWebhook",
"Docs": "",
"Typewords": [
"nullable",
"OutgoingWebhook"
]
},
{
"Name": "IncomingWebhook",
"Docs": "",
"Typewords": [
"nullable",
"IncomingWebhook"
]
},
{
"Name": "FromIDLoginAddresses",
"Docs": "",
"Typewords": [
"[]",
"string"
]
},
{
"Name": "KeepRetiredMessagePeriod",
"Docs": "",
"Typewords": [
"int64"
]
},
{
"Name": "KeepRetiredWebhookPeriod",
"Docs": "",
"Typewords": [
"int64"
]
},
{
"Name": "Domain",
"Docs": "",
@@ -272,6 +542,54 @@
}
]
},
{
"Name": "OutgoingWebhook",
"Docs": "",
"Fields": [
{
"Name": "URL",
"Docs": "",
"Typewords": [
"string"
]
},
{
"Name": "Authorization",
"Docs": "",
"Typewords": [
"string"
]
},
{
"Name": "Events",
"Docs": "",
"Typewords": [
"[]",
"string"
]
}
]
},
{
"Name": "IncomingWebhook",
"Docs": "",
"Fields": [
{
"Name": "URL",
"Docs": "",
"Typewords": [
"string"
]
},
{
"Name": "Authorization",
"Docs": "",
"Typewords": [
"string"
]
}
]
},
{
"Name": "Destination",
"Docs": "",
@@ -551,6 +869,61 @@
}
]
},
{
"Name": "Suppression",
"Docs": "Suppression is an address to which messages will not be delivered. Attempts to\ndeliver or queue will result in an immediate permanent failure to deliver.",
"Fields": [
{
"Name": "ID",
"Docs": "",
"Typewords": [
"int64"
]
},
{
"Name": "Created",
"Docs": "",
"Typewords": [
"timestamp"
]
},
{
"Name": "Account",
"Docs": "Suppression applies to this account only.",
"Typewords": [
"string"
]
},
{
"Name": "BaseAddress",
"Docs": "Unicode. Address with fictional simplified localpart: lowercase, dots removed (gmail), first token before any \"-\" or \"+\" (typical catchall separator).",
"Typewords": [
"string"
]
},
{
"Name": "OriginalAddress",
"Docs": "Unicode. Address that caused this suppression.",
"Typewords": [
"string"
]
},
{
"Name": "Manual",
"Docs": "",
"Typewords": [
"bool"
]
},
{
"Name": "Reason",
"Docs": "",
"Typewords": [
"string"
]
}
]
},
{
"Name": "ImportProgress",
"Docs": "ImportProgress is returned after uploading a file to import.",
@@ -563,6 +936,362 @@
]
}
]
},
{
"Name": "Outgoing",
"Docs": "Outgoing is the payload sent to webhook URLs for events about outgoing deliveries.",
"Fields": [
{
"Name": "Version",
"Docs": "Format of hook, currently 0.",
"Typewords": [
"int32"
]
},
{
"Name": "Event",
"Docs": "Type of outgoing delivery event.",
"Typewords": [
"OutgoingEvent"
]
},
{
"Name": "DSN",
"Docs": "If this event was triggered by a delivery status notification message (DSN).",
"Typewords": [
"bool"
]
},
{
"Name": "Suppressing",
"Docs": "If true, this failure caused the address to be added to the suppression list.",
"Typewords": [
"bool"
]
},
{
"Name": "QueueMsgID",
"Docs": "ID of message in queue.",
"Typewords": [
"int64"
]
},
{
"Name": "FromID",
"Docs": "As used in MAIL FROM, can be empty, for incoming messages.",
"Typewords": [
"string"
]
},
{
"Name": "MessageID",
"Docs": "From Message-Id header, as set by submitter or us, with enclosing \u003c\u003e.",
"Typewords": [
"string"
]
},
{
"Name": "Subject",
"Docs": "Of original message.",
"Typewords": [
"string"
]
},
{
"Name": "WebhookQueued",
"Docs": "When webhook was first queued for delivery.",
"Typewords": [
"timestamp"
]
},
{
"Name": "SMTPCode",
"Docs": "Optional, for errors only, e.g. 451, 550. See package smtp for definitions.",
"Typewords": [
"int32"
]
},
{
"Name": "SMTPEnhancedCode",
"Docs": "Optional, for errors only, e.g. 5.1.1.",
"Typewords": [
"string"
]
},
{
"Name": "Error",
"Docs": "Error message while delivering, or from DSN from remote, if any.",
"Typewords": [
"string"
]
},
{
"Name": "Extra",
"Docs": "Extra fields set for message during submit, through webapi call or through X-Mox-Extra-* headers during SMTP submission.",
"Typewords": [
"{}",
"string"
]
}
]
},
{
"Name": "Incoming",
"Docs": "Incoming is the data sent to a webhook for incoming deliveries over SMTP.",
"Fields": [
{
"Name": "Version",
"Docs": "Format of hook, currently 0.",
"Typewords": [
"int32"
]
},
{
"Name": "From",
"Docs": "Message \"From\" header, typically has one address.",
"Typewords": [
"[]",
"NameAddress"
]
},
{
"Name": "To",
"Docs": "",
"Typewords": [
"[]",
"NameAddress"
]
},
{
"Name": "CC",
"Docs": "",
"Typewords": [
"[]",
"NameAddress"
]
},
{
"Name": "BCC",
"Docs": "Often empty, even if you were a BCC recipient.",
"Typewords": [
"[]",
"NameAddress"
]
},
{
"Name": "ReplyTo",
"Docs": "Optional Reply-To header, typically absent or with one address.",
"Typewords": [
"[]",
"NameAddress"
]
},
{
"Name": "Subject",
"Docs": "",
"Typewords": [
"string"
]
},
{
"Name": "MessageID",
"Docs": "Of Message-Id header, typically of the form \"\u003crandom@hostname\u003e\", includes \u003c\u003e.",
"Typewords": [
"string"
]
},
{
"Name": "InReplyTo",
"Docs": "Optional, the message-id this message is a reply to. Includes \u003c\u003e.",
"Typewords": [
"string"
]
},
{
"Name": "References",
"Docs": "Optional, zero or more message-ids this message is a reply/forward/related to. The last entry is the most recent/immediate message this is a reply to. Earlier entries are the parents in a thread. Values include \u003c\u003e.",
"Typewords": [
"[]",
"string"
]
},
{
"Name": "Date",
"Docs": "Time in \"Date\" message header, can be different from time received.",
"Typewords": [
"nullable",
"timestamp"
]
},
{
"Name": "Text",
"Docs": "Contents of text/plain and/or text/html part (if any), with \"\\n\" line-endings, converted from \"\\r\\n\". Values are truncated to 1MB (1024*1024 bytes). Use webapi MessagePartGet to retrieve the full part data.",
"Typewords": [
"string"
]
},
{
"Name": "HTML",
"Docs": "",
"Typewords": [
"string"
]
},
{
"Name": "Structure",
"Docs": "Parsed form of MIME message.",
"Typewords": [
"Structure"
]
},
{
"Name": "Meta",
"Docs": "Details about message in storage, and SMTP transaction details.",
"Typewords": [
"IncomingMeta"
]
}
]
},
{
"Name": "NameAddress",
"Docs": "",
"Fields": [
{
"Name": "Name",
"Docs": "Optional, human-readable \"display name\" of the addressee.",
"Typewords": [
"string"
]
},
{
"Name": "Address",
"Docs": "Required, email address.",
"Typewords": [
"string"
]
}
]
},
{
"Name": "Structure",
"Docs": "",
"Fields": [
{
"Name": "ContentType",
"Docs": "Lower case, e.g. text/plain.",
"Typewords": [
"string"
]
},
{
"Name": "ContentTypeParams",
"Docs": "Lower case keys, original case values, e.g. {\"charset\": \"UTF-8\"}.",
"Typewords": [
"{}",
"string"
]
},
{
"Name": "ContentID",
"Docs": "Can be empty. Otherwise, should be a value wrapped in \u003c\u003e's. For use in HTML, referenced as URI `cid:...`.",
"Typewords": [
"string"
]
},
{
"Name": "DecodedSize",
"Docs": "Size of content after decoding content-transfer-encoding. For text and HTML parts, this can be larger than the data returned since this size includes \\r\\n line endings.",
"Typewords": [
"int64"
]
},
{
"Name": "Parts",
"Docs": "Subparts of a multipart message, possibly recursive.",
"Typewords": [
"[]",
"Structure"
]
}
]
},
{
"Name": "IncomingMeta",
"Docs": "",
"Fields": [
{
"Name": "MsgID",
"Docs": "ID of message in storage, and to use in webapi calls like MessageGet.",
"Typewords": [
"int64"
]
},
{
"Name": "MailFrom",
"Docs": "Address used during SMTP \"MAIL FROM\" command.",
"Typewords": [
"string"
]
},
{
"Name": "MailFromValidated",
"Docs": "Whether SMTP MAIL FROM address was SPF-validated.",
"Typewords": [
"bool"
]
},
{
"Name": "MsgFromValidated",
"Docs": "Whether address in message \"From\"-header was DMARC(-like) validated.",
"Typewords": [
"bool"
]
},
{
"Name": "RcptTo",
"Docs": "SMTP RCPT TO address used in SMTP.",
"Typewords": [
"string"
]
},
{
"Name": "DKIMVerifiedDomains",
"Docs": "Verified domains from DKIM-signature in message. Can be different domain than used in addresses.",
"Typewords": [
"[]",
"string"
]
},
{
"Name": "RemoteIP",
"Docs": "Where the message was delivered from.",
"Typewords": [
"string"
]
},
{
"Name": "Received",
"Docs": "When message was received, may be different from the Date header.",
"Typewords": [
"timestamp"
]
},
{
"Name": "MailboxName",
"Docs": "Mailbox where message was delivered to, based on configured rules. Defaults to \"Inbox\".",
"Typewords": [
"string"
]
},
{
"Name": "Automated",
"Docs": "Whether this message was automated and should not receive automated replies. E.g. out of office or mailing list messages.",
"Typewords": [
"bool"
]
}
]
}
],
"Ints": [],
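A sketch of how a webhook consumer could walk the Structure tree described above to find a specific MIME part. The findPart helper is ours, not part of mox, and assumes the generated api.Structure type from the TypeScript definitions further below:

const findPart = (s: api.Structure, contentType: string): api.Structure | null => {
	if (s.ContentType === contentType) {
		return s
	}
	for (const p of (s.Parts || [])) {
		const r = findPart(p, contentType)
		if (r) {
			return r
		}
	}
	return null
}
// e.g. findPart(incoming.Structure, 'text/html') locates the HTML part; its full data can then be
// fetched with the webapi MessagePartGet call mentioned in the Text field docs.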
@@ -571,6 +1300,52 @@
"Name": "CSRFToken",
"Docs": "",
"Values": null
},
{
"Name": "OutgoingEvent",
"Docs": "OutgoingEvent is an activity for an outgoing delivery. Either generated by the\nqueue, or through an incoming DSN (delivery status notification) message.",
"Values": [
{
"Name": "EventDelivered",
"Value": "delivered",
"Docs": "Message was accepted by a next-hop server. This does not necessarily mean the\nmessage has been delivered in the mailbox of the user."
},
{
"Name": "EventSuppressed",
"Value": "suppressed",
"Docs": "Outbound delivery was suppressed because the recipient address is on the\nsuppression list of the account, or a simplified/base variant of the address is."
},
{
"Name": "EventDelayed",
"Value": "delayed",
"Docs": "A delivery attempt failed but delivery will be retried again later."
},
{
"Name": "EventFailed",
"Value": "failed",
"Docs": "Delivery of the message failed and will not be tried again. Also see the\n\"Suppressing\" field of [Outgoing]."
},
{
"Name": "EventRelayed",
"Value": "relayed",
"Docs": "Message was relayed into a system that does not generate DSNs. Should only\nhappen when explicitly requested."
},
{
"Name": "EventExpanded",
"Value": "expanded",
"Docs": "Message was accepted and is being delivered to multiple recipients (e.g. the\naddress was an alias/list), which may generate more DSNs."
},
{
"Name": "EventCanceled",
"Value": "canceled",
"Docs": "Message was removed from the queue, e.g. canceled by admin/user."
},
{
"Name": "EventUnrecognized",
"Value": "unrecognized",
"Docs": "An incoming message was received that was either a DSN with an unknown event\ntype (\"action\"), or an incoming non-DSN-message was received for the unique\nper-outgoing-message address used for sending."
}
]
}
],
"SherpaVersion": 0,
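A sketch of dispatching on the OutgoingEvent values documented above in a webhook consumer; the handler and the application-side bookkeeping are hypothetical, only the event names and Outgoing fields come from the definitions in this diff:

const handleOutgoing = (o: api.Outgoing) => {
	switch (o.Event) {
	case api.OutgoingEvent.EventDelivered:
		// Accepted by the next hop; not necessarily in the recipient's mailbox yet.
		break
	case api.OutgoingEvent.EventDelayed:
		// The queue will retry, nothing to do yet.
		break
	case api.OutgoingEvent.EventFailed:
	case api.OutgoingEvent.EventSuppressed:
		// Permanent failure. If o.Suppressing is set, mox also added the recipient to the
		// account's suppression list; record o.FromID and o.Error in the application.
		break
	default:
		console.log('unhandled outgoing event', o.Event, o.Error)
	}
}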
@@ -3,6 +3,11 @@
namespace api {

export interface Account {
OutgoingWebhook?: OutgoingWebhook | null
IncomingWebhook?: IncomingWebhook | null
FromIDLoginAddresses?: string[] | null
KeepRetiredMessagePeriod: number
KeepRetiredWebhookPeriod: number
Domain: string
Description: string
FullName: string
@@ -20,6 +25,17 @@ export interface Account {
DNSDomain: Domain // Parsed form of Domain.
}

export interface OutgoingWebhook {
URL: string
Authorization: string
Events?: string[] | null
}

export interface IncomingWebhook {
URL: string
Authorization: string
}

export interface Destination {
Mailbox: string
Rulesets?: Ruleset[] | null
@@ -78,18 +94,120 @@ export interface Route {
ToDomainASCII?: string[] | null
}

+// Suppression is an address to which messages will not be delivered. Attempts to
+// deliver or queue will result in an immediate permanent failure to deliver.
+export interface Suppression {
+ID: number
+Created: Date
+Account: string // Suppression applies to this account only.
+BaseAddress: string // Unicode. Address with fictional simplified localpart: lowercase, dots removed (gmail), first token before any "-" or "+" (typical catchall separator).
+OriginalAddress: string // Unicode. Address that caused this suppression.
+Manual: boolean
+Reason: string
+}

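For orientation, a Suppression entry as returned by the account web API looks like the sketch below. The field shapes come from the interface above; all the values are made up for this example.

// Illustrative Suppression value; every field value here is hypothetical.
const exampleSuppression: Suppression = {
	ID: 1,
	Created: new Date("2024-05-01T12:00:00Z"),
	Account: "exampleaccount",
	BaseAddress: "user@example.org", // simplified form used for matching
	OriginalAddress: "user+tag@example.org", // address that caused the suppression
	Manual: false,
	Reason: "smtp 550 5.1.1 from remote server",
}
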
// ImportProgress is returned after uploading a file to import.
export interface ImportProgress {
Token: string // For fetching progress, or cancelling an import.
}

+// Outgoing is the payload sent to webhook URLs for events about outgoing deliveries.
+export interface Outgoing {
+Version: number // Format of hook, currently 0.
+Event: OutgoingEvent // Type of outgoing delivery event.
+DSN: boolean // If this event was triggered by a delivery status notification message (DSN).
+Suppressing: boolean // If true, this failure caused the address to be added to the suppression list.
+QueueMsgID: number // ID of message in queue.
+FromID: string // As used in MAIL FROM, can be empty, for incoming messages.
+MessageID: string // From Message-Id header, as set by submitter or us, with enclosing <>.
+Subject: string // Of original message.
+WebhookQueued: Date // When webhook was first queued for delivery.
+SMTPCode: number // Optional, for errors only, e.g. 451, 550. See package smtp for definitions.
+SMTPEnhancedCode: string // Optional, for errors only, e.g. 5.1.1.
+Error: string // Error message while delivering, or from DSN from remote, if any.
+Extra?: { [key: string]: string } // Extra fields set for message during submit, through webapi call or through X-Mox-Extra-* headers during SMTP submission.
+}

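The Outgoing interface above describes the JSON body that is POSTed to the configured outgoing webhook URL for an event. As a sketch, a payload for a permanently failed delivery could look like this; all values are invented for the example.

// Illustrative Outgoing webhook payload; values are hypothetical.
const exampleOutgoing: Outgoing = {
	Version: 0,
	Event: OutgoingEvent.EventFailed,
	DSN: true, // this event came from an incoming DSN message
	Suppressing: true, // the failure added the recipient to the suppression list
	QueueMsgID: 10010,
	FromID: "hypotheticalfromid",
	MessageID: "<hypothetical@mail.example.org>",
	Subject: "example subject",
	WebhookQueued: new Date("2024-05-01T12:00:00Z"),
	SMTPCode: 550,
	SMTPEnhancedCode: "5.1.1",
	Error: "no such user",
	Extra: {},
}
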
+// Incoming is the data sent to a webhook for incoming deliveries over SMTP.
+export interface Incoming {
+Version: number // Format of hook, currently 0.
+From?: NameAddress[] | null // Message "From" header, typically has one address.
+To?: NameAddress[] | null
+CC?: NameAddress[] | null
+BCC?: NameAddress[] | null // Often empty, even if you were a BCC recipient.
+ReplyTo?: NameAddress[] | null // Optional Reply-To header, typically absent or with one address.
+Subject: string
+MessageID: string // Of Message-Id header, typically of the form "<random@hostname>", includes <>.
+InReplyTo: string // Optional, the message-id this message is a reply to. Includes <>.
+References?: string[] | null // Optional, zero or more message-ids this message is a reply/forward/related to. The last entry is the most recent/immediate message this is a reply to. Earlier entries are the parents in a thread. Values include <>.
+Date?: Date | null // Time in "Date" message header, can be different from time received.
+Text: string // Contents of text/plain and/or text/html part (if any), with "\n" line-endings, converted from "\r\n". Values are truncated to 1MB (1024*1024 bytes). Use webapi MessagePartGet to retrieve the full part data.
+HTML: string
+Structure: Structure // Parsed form of MIME message.
+Meta: IncomingMeta // Details about message in storage, and SMTP transaction details.
+}
+
+export interface NameAddress {
+Name: string // Optional, human-readable "display name" of the addressee.
+Address: string // Required, email address.
+}
+
+export interface Structure {
+ContentType: string // Lower case, e.g. text/plain.
+ContentTypeParams?: { [key: string]: string } // Lower case keys, original case values, e.g. {"charset": "UTF-8"}.
+ContentID: string // Can be empty. Otherwise, should be a value wrapped in <>'s. For use in HTML, referenced as URI `cid:...`.
+DecodedSize: number // Size of content after decoding content-transfer-encoding. For text and HTML parts, this can be larger than the data returned since this size includes \r\n line endings.
+Parts?: Structure[] | null // Subparts of a multipart message, possibly recursive.
+}
+
+export interface IncomingMeta {
+MsgID: number // ID of message in storage, and to use in webapi calls like MessageGet.
+MailFrom: string // Address used during SMTP "MAIL FROM" command.
+MailFromValidated: boolean // Whether SMTP MAIL FROM address was SPF-validated.
+MsgFromValidated: boolean // Whether address in message "From"-header was DMARC(-like) validated.
+RcptTo: string // SMTP RCPT TO address used in SMTP.
+DKIMVerifiedDomains?: string[] | null // Verified domains from DKIM-signature in message. Can be different domain than used in addresses.
+RemoteIP: string // Where the message was delivered from.
+Received: Date // When message was received, may be different from the Date header.
+MailboxName: string // Mailbox where message was delivered to, based on configured rules. Defaults to "Inbox".
+Automated: boolean // Whether this message was automated and should not receive automated replies. E.g. out of office or mailing list messages.
+}

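Outgoing and Incoming together describe everything a webhook receiver gets. Below is a minimal receiver sketch in Node.js/TypeScript; the port, the two paths and the Authorization value are assumptions of this example (they match whatever is configured for the account), and note that timestamp fields arrive as JSON strings, not Date objects.

// Minimal webhook receiver sketch. Assumes the account is configured with
// outgoing webhook URL http://localhost:8080/hooks/outgoing and incoming
// webhook URL http://localhost:8080/hooks/incoming, both with the
// Authorization value "Bearer hook-secret".
import { createServer } from "node:http"

createServer((req, res) => {
	if (req.headers.authorization !== "Bearer hook-secret") {
		res.statusCode = 401
		res.end()
		return
	}
	let body = ""
	req.on("data", (chunk) => { body += chunk })
	req.on("end", () => {
		if (req.url === "/hooks/outgoing") {
			// Date/timestamp fields are plain strings after JSON.parse.
			const out = JSON.parse(body) as Outgoing
			console.log("outgoing", out.Event, "queue msg", out.QueueMsgID, out.Error)
		} else if (req.url === "/hooks/incoming") {
			const inc = JSON.parse(body) as Incoming
			console.log("incoming msg", inc.Meta.MsgID, "subject", inc.Subject)
		}
		res.statusCode = 200
		res.end()
	})
}).listen(8080)
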
export type CSRFToken = string

-export const structTypes: {[typename: string]: boolean} = {"Account":true,"AutomaticJunkFlags":true,"Destination":true,"Domain":true,"ImportProgress":true,"JunkFilter":true,"Route":true,"Ruleset":true,"SubjectPass":true}
-export const stringsTypes: {[typename: string]: boolean} = {"CSRFToken":true}
+// OutgoingEvent is an activity for an outgoing delivery. Either generated by the
+// queue, or through an incoming DSN (delivery status notification) message.
+export enum OutgoingEvent {
+// Message was accepted by a next-hop server. This does not necessarily mean the
+// message has been delivered in the mailbox of the user.
+EventDelivered = "delivered",
+// Outbound delivery was suppressed because the recipient address is on the
+// suppression list of the account, or a simplified/base variant of the address is.
+EventSuppressed = "suppressed",
+EventDelayed = "delayed", // A delivery attempt failed but delivery will be retried again later.
+// Delivery of the message failed and will not be tried again. Also see the
+// "Suppressing" field of [Outgoing].
+EventFailed = "failed",
+// Message was relayed into a system that does not generate DSNs. Should only
+// happen when explicitly requested.
+EventRelayed = "relayed",
+// Message was accepted and is being delivered to multiple recipients (e.g. the
+// address was an alias/list), which may generate more DSNs.
+EventExpanded = "expanded",
+EventCanceled = "canceled", // Message was removed from the queue, e.g. canceled by admin/user.
+// An incoming message was received that was either a DSN with an unknown event
+// type ("action"), or an incoming non-DSN-message was received for the unique
+// per-outgoing-message address used for sending.
+EventUnrecognized = "unrecognized",
+}

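One way an application can fold these events into a "keep sending to this address?" decision is sketched below; the grouping is this sketch's reading of the per-event comments above, not an API of mox.

// Returns true when the application should stop sending to the recipient:
// either mox added the address to its suppression list (Suppressing), or the
// event itself signals a permanent failure or suppression.
function stopSendingTo(event: OutgoingEvent, suppressing: boolean): boolean {
	return suppressing || event === OutgoingEvent.EventFailed || event === OutgoingEvent.EventSuppressed
}
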
+export const structTypes: {[typename: string]: boolean} = {"Account":true,"AutomaticJunkFlags":true,"Destination":true,"Domain":true,"ImportProgress":true,"Incoming":true,"IncomingMeta":true,"IncomingWebhook":true,"JunkFilter":true,"NameAddress":true,"Outgoing":true,"OutgoingWebhook":true,"Route":true,"Ruleset":true,"Structure":true,"SubjectPass":true,"Suppression":true}
+export const stringsTypes: {[typename: string]: boolean} = {"CSRFToken":true,"OutgoingEvent":true}
export const intsTypes: {[typename: string]: boolean} = {}
export const types: TypenameMap = {
"Account": {"Name":"Account","Docs":"","Fields":[{"Name":"Domain","Docs":"","Typewords":["string"]},{"Name":"Description","Docs":"","Typewords":["string"]},{"Name":"FullName","Docs":"","Typewords":["string"]},{"Name":"Destinations","Docs":"","Typewords":["{}","Destination"]},{"Name":"SubjectPass","Docs":"","Typewords":["SubjectPass"]},{"Name":"QuotaMessageSize","Docs":"","Typewords":["int64"]},{"Name":"RejectsMailbox","Docs":"","Typewords":["string"]},{"Name":"KeepRejects","Docs":"","Typewords":["bool"]},{"Name":"AutomaticJunkFlags","Docs":"","Typewords":["AutomaticJunkFlags"]},{"Name":"JunkFilter","Docs":"","Typewords":["nullable","JunkFilter"]},{"Name":"MaxOutgoingMessagesPerDay","Docs":"","Typewords":["int32"]},{"Name":"MaxFirstTimeRecipientsPerDay","Docs":"","Typewords":["int32"]},{"Name":"NoFirstTimeSenderDelay","Docs":"","Typewords":["bool"]},{"Name":"Routes","Docs":"","Typewords":["[]","Route"]},{"Name":"DNSDomain","Docs":"","Typewords":["Domain"]}]},
|
"Account": {"Name":"Account","Docs":"","Fields":[{"Name":"OutgoingWebhook","Docs":"","Typewords":["nullable","OutgoingWebhook"]},{"Name":"IncomingWebhook","Docs":"","Typewords":["nullable","IncomingWebhook"]},{"Name":"FromIDLoginAddresses","Docs":"","Typewords":["[]","string"]},{"Name":"KeepRetiredMessagePeriod","Docs":"","Typewords":["int64"]},{"Name":"KeepRetiredWebhookPeriod","Docs":"","Typewords":["int64"]},{"Name":"Domain","Docs":"","Typewords":["string"]},{"Name":"Description","Docs":"","Typewords":["string"]},{"Name":"FullName","Docs":"","Typewords":["string"]},{"Name":"Destinations","Docs":"","Typewords":["{}","Destination"]},{"Name":"SubjectPass","Docs":"","Typewords":["SubjectPass"]},{"Name":"QuotaMessageSize","Docs":"","Typewords":["int64"]},{"Name":"RejectsMailbox","Docs":"","Typewords":["string"]},{"Name":"KeepRejects","Docs":"","Typewords":["bool"]},{"Name":"AutomaticJunkFlags","Docs":"","Typewords":["AutomaticJunkFlags"]},{"Name":"JunkFilter","Docs":"","Typewords":["nullable","JunkFilter"]},{"Name":"MaxOutgoingMessagesPerDay","Docs":"","Typewords":["int32"]},{"Name":"MaxFirstTimeRecipientsPerDay","Docs":"","Typewords":["int32"]},{"Name":"NoFirstTimeSenderDelay","Docs":"","Typewords":["bool"]},{"Name":"Routes","Docs":"","Typewords":["[]","Route"]},{"Name":"DNSDomain","Docs":"","Typewords":["Domain"]}]},
"OutgoingWebhook": {"Name":"OutgoingWebhook","Docs":"","Fields":[{"Name":"URL","Docs":"","Typewords":["string"]},{"Name":"Authorization","Docs":"","Typewords":["string"]},{"Name":"Events","Docs":"","Typewords":["[]","string"]}]},
"IncomingWebhook": {"Name":"IncomingWebhook","Docs":"","Fields":[{"Name":"URL","Docs":"","Typewords":["string"]},{"Name":"Authorization","Docs":"","Typewords":["string"]}]},
"Destination": {"Name":"Destination","Docs":"","Fields":[{"Name":"Mailbox","Docs":"","Typewords":["string"]},{"Name":"Rulesets","Docs":"","Typewords":["[]","Ruleset"]},{"Name":"FullName","Docs":"","Typewords":["string"]}]},
"Ruleset": {"Name":"Ruleset","Docs":"","Fields":[{"Name":"SMTPMailFromRegexp","Docs":"","Typewords":["string"]},{"Name":"VerifiedDomain","Docs":"","Typewords":["string"]},{"Name":"HeadersRegexp","Docs":"","Typewords":["{}","string"]},{"Name":"IsForward","Docs":"","Typewords":["bool"]},{"Name":"ListAllowDomain","Docs":"","Typewords":["string"]},{"Name":"AcceptRejectsToMailbox","Docs":"","Typewords":["string"]},{"Name":"Mailbox","Docs":"","Typewords":["string"]},{"Name":"VerifiedDNSDomain","Docs":"","Typewords":["Domain"]},{"Name":"ListAllowDNSDomain","Docs":"","Typewords":["Domain"]}]},
"Domain": {"Name":"Domain","Docs":"","Fields":[{"Name":"ASCII","Docs":"","Typewords":["string"]},{"Name":"Unicode","Docs":"","Typewords":["string"]}]},
@@ -97,12 +215,21 @@ export const types: TypenameMap = {
"AutomaticJunkFlags": {"Name":"AutomaticJunkFlags","Docs":"","Fields":[{"Name":"Enabled","Docs":"","Typewords":["bool"]},{"Name":"JunkMailboxRegexp","Docs":"","Typewords":["string"]},{"Name":"NeutralMailboxRegexp","Docs":"","Typewords":["string"]},{"Name":"NotJunkMailboxRegexp","Docs":"","Typewords":["string"]}]},
"JunkFilter": {"Name":"JunkFilter","Docs":"","Fields":[{"Name":"Threshold","Docs":"","Typewords":["float64"]},{"Name":"Onegrams","Docs":"","Typewords":["bool"]},{"Name":"Twograms","Docs":"","Typewords":["bool"]},{"Name":"Threegrams","Docs":"","Typewords":["bool"]},{"Name":"MaxPower","Docs":"","Typewords":["float64"]},{"Name":"TopWords","Docs":"","Typewords":["int32"]},{"Name":"IgnoreWords","Docs":"","Typewords":["float64"]},{"Name":"RareWords","Docs":"","Typewords":["int32"]}]},
"Route": {"Name":"Route","Docs":"","Fields":[{"Name":"FromDomain","Docs":"","Typewords":["[]","string"]},{"Name":"ToDomain","Docs":"","Typewords":["[]","string"]},{"Name":"MinimumAttempts","Docs":"","Typewords":["int32"]},{"Name":"Transport","Docs":"","Typewords":["string"]},{"Name":"FromDomainASCII","Docs":"","Typewords":["[]","string"]},{"Name":"ToDomainASCII","Docs":"","Typewords":["[]","string"]}]},
"Suppression": {"Name":"Suppression","Docs":"","Fields":[{"Name":"ID","Docs":"","Typewords":["int64"]},{"Name":"Created","Docs":"","Typewords":["timestamp"]},{"Name":"Account","Docs":"","Typewords":["string"]},{"Name":"BaseAddress","Docs":"","Typewords":["string"]},{"Name":"OriginalAddress","Docs":"","Typewords":["string"]},{"Name":"Manual","Docs":"","Typewords":["bool"]},{"Name":"Reason","Docs":"","Typewords":["string"]}]},
"ImportProgress": {"Name":"ImportProgress","Docs":"","Fields":[{"Name":"Token","Docs":"","Typewords":["string"]}]},
"Outgoing": {"Name":"Outgoing","Docs":"","Fields":[{"Name":"Version","Docs":"","Typewords":["int32"]},{"Name":"Event","Docs":"","Typewords":["OutgoingEvent"]},{"Name":"DSN","Docs":"","Typewords":["bool"]},{"Name":"Suppressing","Docs":"","Typewords":["bool"]},{"Name":"QueueMsgID","Docs":"","Typewords":["int64"]},{"Name":"FromID","Docs":"","Typewords":["string"]},{"Name":"MessageID","Docs":"","Typewords":["string"]},{"Name":"Subject","Docs":"","Typewords":["string"]},{"Name":"WebhookQueued","Docs":"","Typewords":["timestamp"]},{"Name":"SMTPCode","Docs":"","Typewords":["int32"]},{"Name":"SMTPEnhancedCode","Docs":"","Typewords":["string"]},{"Name":"Error","Docs":"","Typewords":["string"]},{"Name":"Extra","Docs":"","Typewords":["{}","string"]}]},
"Incoming": {"Name":"Incoming","Docs":"","Fields":[{"Name":"Version","Docs":"","Typewords":["int32"]},{"Name":"From","Docs":"","Typewords":["[]","NameAddress"]},{"Name":"To","Docs":"","Typewords":["[]","NameAddress"]},{"Name":"CC","Docs":"","Typewords":["[]","NameAddress"]},{"Name":"BCC","Docs":"","Typewords":["[]","NameAddress"]},{"Name":"ReplyTo","Docs":"","Typewords":["[]","NameAddress"]},{"Name":"Subject","Docs":"","Typewords":["string"]},{"Name":"MessageID","Docs":"","Typewords":["string"]},{"Name":"InReplyTo","Docs":"","Typewords":["string"]},{"Name":"References","Docs":"","Typewords":["[]","string"]},{"Name":"Date","Docs":"","Typewords":["nullable","timestamp"]},{"Name":"Text","Docs":"","Typewords":["string"]},{"Name":"HTML","Docs":"","Typewords":["string"]},{"Name":"Structure","Docs":"","Typewords":["Structure"]},{"Name":"Meta","Docs":"","Typewords":["IncomingMeta"]}]},
"NameAddress": {"Name":"NameAddress","Docs":"","Fields":[{"Name":"Name","Docs":"","Typewords":["string"]},{"Name":"Address","Docs":"","Typewords":["string"]}]},
"Structure": {"Name":"Structure","Docs":"","Fields":[{"Name":"ContentType","Docs":"","Typewords":["string"]},{"Name":"ContentTypeParams","Docs":"","Typewords":["{}","string"]},{"Name":"ContentID","Docs":"","Typewords":["string"]},{"Name":"DecodedSize","Docs":"","Typewords":["int64"]},{"Name":"Parts","Docs":"","Typewords":["[]","Structure"]}]},
"IncomingMeta": {"Name":"IncomingMeta","Docs":"","Fields":[{"Name":"MsgID","Docs":"","Typewords":["int64"]},{"Name":"MailFrom","Docs":"","Typewords":["string"]},{"Name":"MailFromValidated","Docs":"","Typewords":["bool"]},{"Name":"MsgFromValidated","Docs":"","Typewords":["bool"]},{"Name":"RcptTo","Docs":"","Typewords":["string"]},{"Name":"DKIMVerifiedDomains","Docs":"","Typewords":["[]","string"]},{"Name":"RemoteIP","Docs":"","Typewords":["string"]},{"Name":"Received","Docs":"","Typewords":["timestamp"]},{"Name":"MailboxName","Docs":"","Typewords":["string"]},{"Name":"Automated","Docs":"","Typewords":["bool"]}]},
"CSRFToken": {"Name":"CSRFToken","Docs":"","Values":null},
"OutgoingEvent": {"Name":"OutgoingEvent","Docs":"","Values":[{"Name":"EventDelivered","Value":"delivered","Docs":""},{"Name":"EventSuppressed","Value":"suppressed","Docs":""},{"Name":"EventDelayed","Value":"delayed","Docs":""},{"Name":"EventFailed","Value":"failed","Docs":""},{"Name":"EventRelayed","Value":"relayed","Docs":""},{"Name":"EventExpanded","Value":"expanded","Docs":""},{"Name":"EventCanceled","Value":"canceled","Docs":""},{"Name":"EventUnrecognized","Value":"unrecognized","Docs":""}]},
}
||||||
|
|
||||||

export const parser = {
Account: (v: any) => parse("Account", v) as Account,
+OutgoingWebhook: (v: any) => parse("OutgoingWebhook", v) as OutgoingWebhook,
+IncomingWebhook: (v: any) => parse("IncomingWebhook", v) as IncomingWebhook,
Destination: (v: any) => parse("Destination", v) as Destination,
Ruleset: (v: any) => parse("Ruleset", v) as Ruleset,
Domain: (v: any) => parse("Domain", v) as Domain,
@@ -110,8 +237,15 @@ export const parser = {
AutomaticJunkFlags: (v: any) => parse("AutomaticJunkFlags", v) as AutomaticJunkFlags,
JunkFilter: (v: any) => parse("JunkFilter", v) as JunkFilter,
Route: (v: any) => parse("Route", v) as Route,
+Suppression: (v: any) => parse("Suppression", v) as Suppression,
ImportProgress: (v: any) => parse("ImportProgress", v) as ImportProgress,
+Outgoing: (v: any) => parse("Outgoing", v) as Outgoing,
+Incoming: (v: any) => parse("Incoming", v) as Incoming,
+NameAddress: (v: any) => parse("NameAddress", v) as NameAddress,
+Structure: (v: any) => parse("Structure", v) as Structure,
+IncomingMeta: (v: any) => parse("IncomingMeta", v) as IncomingMeta,
CSRFToken: (v: any) => parse("CSRFToken", v) as CSRFToken,
+OutgoingEvent: (v: any) => parse("OutgoingEvent", v) as OutgoingEvent,
}

// Account exports web API functions for the account web interface. All its
@@ -187,14 +321,16 @@ export class Client {
// Account returns information about the account.
// StorageUsed is the sum of the sizes of all messages, in bytes.
// StorageLimit is the maximum storage that can be used, or 0 if there is no limit.
-async Account(): Promise<[Account, number, number]> {
+async Account(): Promise<[Account, number, number, Suppression[] | null]> {
const fn: string = "Account"
const paramTypes: string[][] = []
-const returnTypes: string[][] = [["Account"],["int64"],["int64"]]
+const returnTypes: string[][] = [["Account"],["int64"],["int64"],["[]","Suppression"]]
const params: any[] = []
-return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as [Account, number, number]
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as [Account, number, number, Suppression[] | null]
}

+// AccountSaveFullName saves the full name (used as display name in email messages)
+// for the account.
async AccountSaveFullName(fullName: string): Promise<void> {
const fn: string = "AccountSaveFullName"
const paramTypes: string[][] = [["string"]]
@@ -232,6 +368,97 @@ export class Client {
const params: any[] = []
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as ImportProgress
}

+// SuppressionList lists the addresses on the suppression list of this account.
+async SuppressionList(): Promise<Suppression[] | null> {
+const fn: string = "SuppressionList"
+const paramTypes: string[][] = []
+const returnTypes: string[][] = [["[]","Suppression"]]
+const params: any[] = []
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as Suppression[] | null
+}
+
+// SuppressionAdd adds an email address to the suppression list.
+async SuppressionAdd(address: string, manual: boolean, reason: string): Promise<Suppression> {
+const fn: string = "SuppressionAdd"
+const paramTypes: string[][] = [["string"],["bool"],["string"]]
+const returnTypes: string[][] = [["Suppression"]]
+const params: any[] = [address, manual, reason]
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as Suppression
+}
+
+// SuppressionRemove removes the email address from the suppression list.
+async SuppressionRemove(address: string): Promise<void> {
+const fn: string = "SuppressionRemove"
+const paramTypes: string[][] = [["string"]]
+const returnTypes: string[][] = []
+const params: any[] = [address]
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as void
+}

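A hypothetical use of the three suppression methods above from application code, assuming an already-authenticated Client instance from this module:

// Sketch: inspect and edit this account's suppression list.
async function reviewSuppressions(client: Client): Promise<void> {
	const current = await client.SuppressionList()
	for (const s of current || []) {
		console.log(s.OriginalAddress, "suppressed since", s.Created, "reason:", s.Reason)
	}
	// Manually suppress an address, e.g. after an unsubscribe request...
	await client.SuppressionAdd("someone@example.org", true, "manual unsubscribe")
	// ...and later allow delivery again.
	await client.SuppressionRemove("someone@example.org")
}
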
+// OutgoingWebhookSave saves a new webhook url for outgoing deliveries. If url
+// is empty, the webhook is disabled. If authorization is non-empty it is used for
+// the Authorization header in HTTP requests. Events specifies the outgoing events
+// to be delivered, or all if empty/nil.
+async OutgoingWebhookSave(url: string, authorization: string, events: string[] | null): Promise<void> {
+const fn: string = "OutgoingWebhookSave"
+const paramTypes: string[][] = [["string"],["string"],["[]","string"]]
+const returnTypes: string[][] = []
+const params: any[] = [url, authorization, events]
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as void
+}
+
+// OutgoingWebhookTest makes a test webhook call to urlStr, with optional
+// authorization. If the HTTP request is made this call will succeed also for
+// non-2xx HTTP status codes.
+async OutgoingWebhookTest(urlStr: string, authorization: string, data: Outgoing): Promise<[number, string, string]> {
+const fn: string = "OutgoingWebhookTest"
+const paramTypes: string[][] = [["string"],["string"],["Outgoing"]]
+const returnTypes: string[][] = [["int32"],["string"],["string"]]
+const params: any[] = [urlStr, authorization, data]
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as [number, string, string]
+}
+
+// IncomingWebhookSave saves a new webhook url for incoming deliveries. If url is
+// empty, the webhook is disabled. If authorization is not empty, it is used in
+// the Authorization header in requests.
+async IncomingWebhookSave(url: string, authorization: string): Promise<void> {
+const fn: string = "IncomingWebhookSave"
+const paramTypes: string[][] = [["string"],["string"]]
+const returnTypes: string[][] = []
+const params: any[] = [url, authorization]
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as void
+}
+
+// IncomingWebhookTest makes a test webhook HTTP delivery request to urlStr,
+// with optional authorization header. If the HTTP call is made, this function
+// returns non-error regardless of HTTP status code.
+async IncomingWebhookTest(urlStr: string, authorization: string, data: Incoming): Promise<[number, string, string]> {
+const fn: string = "IncomingWebhookTest"
+const paramTypes: string[][] = [["string"],["string"],["Incoming"]]
+const returnTypes: string[][] = [["int32"],["string"],["string"]]
+const params: any[] = [urlStr, authorization, data]
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as [number, string, string]
+}

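And a hypothetical configuration call using the webhook methods above; the URLs and Authorization value are examples, and the event names are the lowercase values of OutgoingEvent:

// Sketch: register webhook URLs for this account and fire a test call.
async function configureWebhooks(client: Client, sample: Outgoing): Promise<void> {
	await client.OutgoingWebhookSave("https://app.example.org/hooks/outgoing", "Bearer hook-secret", ["delivered", "failed", "suppressed"])
	await client.IncomingWebhookSave("https://app.example.org/hooks/incoming", "Bearer hook-secret")
	// OutgoingWebhookTest also succeeds for non-2xx responses; it returns the
	// HTTP status code and response details for display.
	const [code] = await client.OutgoingWebhookTest("https://app.example.org/hooks/outgoing", "Bearer hook-secret", sample)
	console.log("test webhook returned HTTP status", code)
}
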
+// FromIDLoginAddressesSave saves new login addresses to enable unique SMTP
+// MAIL FROM addresses ("fromid") for deliveries from the queue.
+async FromIDLoginAddressesSave(loginAddresses: string[] | null): Promise<void> {
+const fn: string = "FromIDLoginAddressesSave"
+const paramTypes: string[][] = [["[]","string"]]
+const returnTypes: string[][] = []
+const params: any[] = [loginAddresses]
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as void
+}
+
+// KeepRetiredPeriodsSave save periods to save retired messages and webhooks.
+async KeepRetiredPeriodsSave(keepRetiredMessagePeriod: number, keepRetiredWebhookPeriod: number): Promise<void> {
+const fn: string = "KeepRetiredPeriodsSave"
+const paramTypes: string[][] = [["int64"],["int64"]]
+const returnTypes: string[][] = []
+const params: any[] = [keepRetiredMessagePeriod, keepRetiredWebhookPeriod]
+return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as void
+}
}

export const defaultBaseURL = (function() {

@@ -1952,7 +1952,12 @@ func (Admin) SetPassword(ctx context.Context, accountName, password string) {

// AccountSettingsSave set new settings for an account that only an admin can set.
func (Admin) AccountSettingsSave(ctx context.Context, accountName string, maxOutgoingMessagesPerDay, maxFirstTimeRecipientsPerDay int, maxMsgSize int64, firstTimeSenderDelay bool) {
-err := mox.AccountAdminSettingsSave(ctx, accountName, maxOutgoingMessagesPerDay, maxFirstTimeRecipientsPerDay, maxMsgSize, firstTimeSenderDelay)
+err := mox.AccountSave(ctx, accountName, func(acc *config.Account) {
+acc.MaxOutgoingMessagesPerDay = maxOutgoingMessagesPerDay
+acc.MaxFirstTimeRecipientsPerDay = maxFirstTimeRecipientsPerDay
+acc.QuotaMessageSize = maxMsgSize
+acc.NoFirstTimeSenderDelay = !firstTimeSenderDelay
+})
xcheckf(ctx, err, "saving account settings")
}

@@ -2005,8 +2010,8 @@ func (Admin) QueueHoldRuleRemove(ctx context.Context, holdRuleID int64) {
}

// QueueList returns the messages currently in the outgoing queue.
-func (Admin) QueueList(ctx context.Context, filter queue.Filter) []queue.Msg {
+func (Admin) QueueList(ctx context.Context, filter queue.Filter, sort queue.Sort) []queue.Msg {
-l, err := queue.List(ctx, filter)
+l, err := queue.List(ctx, filter, sort)
xcheckf(ctx, err, "listing messages in queue")
return l
}
@@ -2066,6 +2071,59 @@ func (Admin) QueueTransportSet(ctx context.Context, filter queue.Filter, transpo
return n
}

+// RetiredList returns messages retired from the queue (delivery could
+// have succeeded or failed).
+func (Admin) RetiredList(ctx context.Context, filter queue.RetiredFilter, sort queue.RetiredSort) []queue.MsgRetired {
+l, err := queue.RetiredList(ctx, filter, sort)
+xcheckf(ctx, err, "listing retired messages")
+return l
+}
+
+// HookQueueSize returns the number of webhooks still to be delivered.
+func (Admin) HookQueueSize(ctx context.Context) int {
+n, err := queue.HookQueueSize(ctx)
+xcheckf(ctx, err, "get hook queue size")
+return n
+}
+
+// HookList lists webhooks still to be delivered.
+func (Admin) HookList(ctx context.Context, filter queue.HookFilter, sort queue.HookSort) []queue.Hook {
+l, err := queue.HookList(ctx, filter, sort)
+xcheckf(ctx, err, "listing hook queue")
+return l
+}
+
+// HookNextAttemptSet sets a new time for next delivery attempt of matching
+// hooks from the queue.
+func (Admin) HookNextAttemptSet(ctx context.Context, filter queue.HookFilter, minutes int) (affected int) {
+n, err := queue.HookNextAttemptSet(ctx, filter, time.Now().Add(time.Duration(minutes)*time.Minute))
+xcheckf(ctx, err, "setting new next delivery attempt time for matching webhooks in queue")
+return n
+}
+
+// HookNextAttemptAdd adds a duration to the time of next delivery attempt of
+// matching hooks from the queue.
+func (Admin) HookNextAttemptAdd(ctx context.Context, filter queue.HookFilter, minutes int) (affected int) {
+n, err := queue.HookNextAttemptAdd(ctx, filter, time.Duration(minutes)*time.Minute)
+xcheckf(ctx, err, "adding duration to next delivery attempt for matching webhooks in queue")
+return n
+}
+
+// HookRetiredList lists retired webhooks.
+func (Admin) HookRetiredList(ctx context.Context, filter queue.HookRetiredFilter, sort queue.HookRetiredSort) []queue.HookRetired {
+l, err := queue.HookRetiredList(ctx, filter, sort)
+xcheckf(ctx, err, "listing retired hooks")
+return l
+}
+
+// HookCancel prevents further delivery attempts of matching webhooks.
+func (Admin) HookCancel(ctx context.Context, filter queue.HookFilter) (affected int) {
+log := pkglog.WithContext(ctx)
+n, err := queue.HookCancel(ctx, log, filter)
+xcheckf(ctx, err, "cancel hooks in queue")
+return n
+}
+
// LogLevels returns the current log levels.
func (Admin) LogLevels(ctx context.Context) map[string]string {
m := map[string]string{}

@@ -14,6 +14,7 @@ h2 { font-size: 1.1rem; }
h3, h4 { font-size: 1rem; }
ul { padding-left: 1rem; }
.literal { background-color: #eee; padding: .5em 1em; margin: 1ex 0; border: 1px solid #eee; border-radius: 4px; white-space: pre-wrap; font-family: monospace; font-size: 15px; tab-size: 4; }
+table { border-spacing: 0; }
table td, table th { padding: .2em .5em; }
table table td, table table th { padding: 0 0.1em; }
table.long >tbody >tr >td { padding: 1em .5em; }
@@ -24,10 +25,16 @@ table.hover > tbody > tr:hover { background-color: #f0f0f0; }
p { margin-bottom: 1em; max-width: 50em; }
[title] { text-decoration: underline; text-decoration-style: dotted; }
fieldset { border: 0; }
+.twocols { display: flex; gap: 2em; }
+.unclutter { opacity: .5; }
+.unclutter:hover { opacity: 1; }
+@media (max-width:1910px) {
+.twocols { display: block; gap: 2em; }
+}
.scriptswitch { text-decoration: underline #dca053 2px; }
thead { position: sticky; top: 0; background-color: white; box-shadow: 0 1px 1px rgba(0, 0, 0, 0.1); }
-#page { opacity: 1; animation: fadein 0.15s ease-in; }
+#page, .loadend { opacity: 1; animation: fadein 0.15s ease-in; }
-#page.loading { opacity: 0.1; animation: fadeout 1s ease-out; }
+#page.loading, .loadstart { opacity: 0.1; animation: fadeout 1s ease-out; }
@keyframes fadein { 0% { opacity: 0 } 100% { opacity: 1 } }
@keyframes fadeout { 0% { opacity: 1 } 100% { opacity: 0.1 } }
</style>

@@ -220,6 +220,7 @@ const [dom, style, attr, prop] = (function () {
autocomplete: (s) => _attr('autocomplete', s),
list: (s) => _attr('list', s),
form: (s) => _attr('form', s),
+size: (s) => _attr('size', s),
};
const style = (x) => { return { _styles: x }; };
const prop = (x) => { return { _props: x }; };
@@ -336,7 +337,7 @@ var api;
SPFResult["SPFTemperror"] = "temperror";
SPFResult["SPFPermerror"] = "permerror";
})(SPFResult = api.SPFResult || (api.SPFResult = {}));
api.structTypes = { "Account": true, "AuthResults": true, "AutoconfCheckResult": true, "AutodiscoverCheckResult": true, "AutodiscoverSRV": true, "AutomaticJunkFlags": true, "CheckResult": true, "ClientConfigs": true, "ClientConfigsEntry": true, "DANECheckResult": true, "DKIMAuthResult": true, "DKIMCheckResult": true, "DKIMRecord": true, "DMARCCheckResult": true, "DMARCRecord": true, "DMARCSummary": true, "DNSSECResult": true, "DateRange": true, "Destination": true, "Directive": true, "Domain": true, "DomainFeedback": true, "Evaluation": true, "EvaluationStat": true, "Extension": true, "FailureDetails": true, "Filter": true, "HoldRule": true, "IPDomain": true, "IPRevCheckResult": true, "Identifiers": true, "JunkFilter": true, "MTASTSCheckResult": true, "MTASTSRecord": true, "MX": true, "MXCheckResult": true, "Modifier": true, "Msg": true, "Pair": true, "Policy": true, "PolicyEvaluated": true, "PolicyOverrideReason": true, "PolicyPublished": true, "PolicyRecord": true, "Record": true, "Report": true, "ReportMetadata": true, "ReportRecord": true, "Result": true, "ResultPolicy": true, "Reverse": true, "Route": true, "Row": true, "Ruleset": true, "SMTPAuth": true, "SPFAuthResult": true, "SPFCheckResult": true, "SPFRecord": true, "SRV": true, "SRVConfCheckResult": true, "STSMX": true, "SubjectPass": true, "Summary": true, "SuppressAddress": true, "TLSCheckResult": true, "TLSRPTCheckResult": true, "TLSRPTDateRange": true, "TLSRPTRecord": true, "TLSRPTSummary": true, "TLSRPTSuppressAddress": true, "TLSReportRecord": true, "TLSResult": true, "Transport": true, "TransportDirect": true, "TransportSMTP": true, "TransportSocks": true, "URI": true, "WebForward": true, "WebHandler": true, "WebRedirect": true, "WebStatic": true, "WebserverConfig": true };
|
api.structTypes = { "Account": true, "AuthResults": true, "AutoconfCheckResult": true, "AutodiscoverCheckResult": true, "AutodiscoverSRV": true, "AutomaticJunkFlags": true, "CheckResult": true, "ClientConfigs": true, "ClientConfigsEntry": true, "DANECheckResult": true, "DKIMAuthResult": true, "DKIMCheckResult": true, "DKIMRecord": true, "DMARCCheckResult": true, "DMARCRecord": true, "DMARCSummary": true, "DNSSECResult": true, "DateRange": true, "Destination": true, "Directive": true, "Domain": true, "DomainFeedback": true, "Evaluation": true, "EvaluationStat": true, "Extension": true, "FailureDetails": true, "Filter": true, "HoldRule": true, "Hook": true, "HookFilter": true, "HookResult": true, "HookRetired": true, "HookRetiredFilter": true, "HookRetiredSort": true, "HookSort": true, "IPDomain": true, "IPRevCheckResult": true, "Identifiers": true, "IncomingWebhook": true, "JunkFilter": true, "MTASTSCheckResult": true, "MTASTSRecord": true, "MX": true, "MXCheckResult": true, "Modifier": true, "Msg": true, "MsgResult": true, "MsgRetired": true, "OutgoingWebhook": true, "Pair": true, "Policy": true, "PolicyEvaluated": true, "PolicyOverrideReason": true, "PolicyPublished": true, "PolicyRecord": true, "Record": true, "Report": true, "ReportMetadata": true, "ReportRecord": true, "Result": true, "ResultPolicy": true, "RetiredFilter": true, "RetiredSort": true, "Reverse": true, "Route": true, "Row": true, "Ruleset": true, "SMTPAuth": true, "SPFAuthResult": true, "SPFCheckResult": true, "SPFRecord": true, "SRV": true, "SRVConfCheckResult": true, "STSMX": true, "Sort": true, "SubjectPass": true, "Summary": true, "SuppressAddress": true, "TLSCheckResult": true, "TLSRPTCheckResult": true, "TLSRPTDateRange": true, "TLSRPTRecord": true, "TLSRPTSummary": true, "TLSRPTSuppressAddress": true, "TLSReportRecord": true, "TLSResult": true, "Transport": true, "TransportDirect": true, "TransportSMTP": true, "TransportSocks": true, "URI": true, "WebForward": true, "WebHandler": true, "WebRedirect": true, "WebStatic": true, "WebserverConfig": true };
api.stringsTypes = { "Align": true, "Alignment": true, "CSRFToken": true, "DKIMResult": true, "DMARCPolicy": true, "DMARCResult": true, "Disposition": true, "IP": true, "Localpart": true, "Mode": true, "PolicyOverride": true, "PolicyType": true, "RUA": true, "ResultType": true, "SPFDomainScope": true, "SPFResult": true };
api.intsTypes = {};
api.types = {
@@ -371,7 +372,9 @@ var api;
"AutoconfCheckResult": { "Name": "AutoconfCheckResult", "Docs": "", "Fields": [{ "Name": "ClientSettingsDomainIPs", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "IPs", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Errors", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Warnings", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Instructions", "Docs": "", "Typewords": ["[]", "string"] }] },
"AutodiscoverCheckResult": { "Name": "AutodiscoverCheckResult", "Docs": "", "Fields": [{ "Name": "Records", "Docs": "", "Typewords": ["[]", "AutodiscoverSRV"] }, { "Name": "Errors", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Warnings", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "Instructions", "Docs": "", "Typewords": ["[]", "string"] }] },
"AutodiscoverSRV": { "Name": "AutodiscoverSRV", "Docs": "", "Fields": [{ "Name": "Target", "Docs": "", "Typewords": ["string"] }, { "Name": "Port", "Docs": "", "Typewords": ["uint16"] }, { "Name": "Priority", "Docs": "", "Typewords": ["uint16"] }, { "Name": "Weight", "Docs": "", "Typewords": ["uint16"] }, { "Name": "IPs", "Docs": "", "Typewords": ["[]", "string"] }] },
"Account": { "Name": "Account", "Docs": "", "Fields": [{ "Name": "Domain", "Docs": "", "Typewords": ["string"] }, { "Name": "Description", "Docs": "", "Typewords": ["string"] }, { "Name": "FullName", "Docs": "", "Typewords": ["string"] }, { "Name": "Destinations", "Docs": "", "Typewords": ["{}", "Destination"] }, { "Name": "SubjectPass", "Docs": "", "Typewords": ["SubjectPass"] }, { "Name": "QuotaMessageSize", "Docs": "", "Typewords": ["int64"] }, { "Name": "RejectsMailbox", "Docs": "", "Typewords": ["string"] }, { "Name": "KeepRejects", "Docs": "", "Typewords": ["bool"] }, { "Name": "AutomaticJunkFlags", "Docs": "", "Typewords": ["AutomaticJunkFlags"] }, { "Name": "JunkFilter", "Docs": "", "Typewords": ["nullable", "JunkFilter"] }, { "Name": "MaxOutgoingMessagesPerDay", "Docs": "", "Typewords": ["int32"] }, { "Name": "MaxFirstTimeRecipientsPerDay", "Docs": "", "Typewords": ["int32"] }, { "Name": "NoFirstTimeSenderDelay", "Docs": "", "Typewords": ["bool"] }, { "Name": "Routes", "Docs": "", "Typewords": ["[]", "Route"] }, { "Name": "DNSDomain", "Docs": "", "Typewords": ["Domain"] }] },
|
"Account": { "Name": "Account", "Docs": "", "Fields": [{ "Name": "OutgoingWebhook", "Docs": "", "Typewords": ["nullable", "OutgoingWebhook"] }, { "Name": "IncomingWebhook", "Docs": "", "Typewords": ["nullable", "IncomingWebhook"] }, { "Name": "FromIDLoginAddresses", "Docs": "", "Typewords": ["[]", "string"] }, { "Name": "KeepRetiredMessagePeriod", "Docs": "", "Typewords": ["int64"] }, { "Name": "KeepRetiredWebhookPeriod", "Docs": "", "Typewords": ["int64"] }, { "Name": "Domain", "Docs": "", "Typewords": ["string"] }, { "Name": "Description", "Docs": "", "Typewords": ["string"] }, { "Name": "FullName", "Docs": "", "Typewords": ["string"] }, { "Name": "Destinations", "Docs": "", "Typewords": ["{}", "Destination"] }, { "Name": "SubjectPass", "Docs": "", "Typewords": ["SubjectPass"] }, { "Name": "QuotaMessageSize", "Docs": "", "Typewords": ["int64"] }, { "Name": "RejectsMailbox", "Docs": "", "Typewords": ["string"] }, { "Name": "KeepRejects", "Docs": "", "Typewords": ["bool"] }, { "Name": "AutomaticJunkFlags", "Docs": "", "Typewords": ["AutomaticJunkFlags"] }, { "Name": "JunkFilter", "Docs": "", "Typewords": ["nullable", "JunkFilter"] }, { "Name": "MaxOutgoingMessagesPerDay", "Docs": "", "Typewords": ["int32"] }, { "Name": "MaxFirstTimeRecipientsPerDay", "Docs": "", "Typewords": ["int32"] }, { "Name": "NoFirstTimeSenderDelay", "Docs": "", "Typewords": ["bool"] }, { "Name": "Routes", "Docs": "", "Typewords": ["[]", "Route"] }, { "Name": "DNSDomain", "Docs": "", "Typewords": ["Domain"] }] },
"OutgoingWebhook": { "Name": "OutgoingWebhook", "Docs": "", "Fields": [{ "Name": "URL", "Docs": "", "Typewords": ["string"] }, { "Name": "Authorization", "Docs": "", "Typewords": ["string"] }, { "Name": "Events", "Docs": "", "Typewords": ["[]", "string"] }] },
"IncomingWebhook": { "Name": "IncomingWebhook", "Docs": "", "Fields": [{ "Name": "URL", "Docs": "", "Typewords": ["string"] }, { "Name": "Authorization", "Docs": "", "Typewords": ["string"] }] },
"Destination": { "Name": "Destination", "Docs": "", "Fields": [{ "Name": "Mailbox", "Docs": "", "Typewords": ["string"] }, { "Name": "Rulesets", "Docs": "", "Typewords": ["[]", "Ruleset"] }, { "Name": "FullName", "Docs": "", "Typewords": ["string"] }] },
"Ruleset": { "Name": "Ruleset", "Docs": "", "Fields": [{ "Name": "SMTPMailFromRegexp", "Docs": "", "Typewords": ["string"] }, { "Name": "VerifiedDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "HeadersRegexp", "Docs": "", "Typewords": ["{}", "string"] }, { "Name": "IsForward", "Docs": "", "Typewords": ["bool"] }, { "Name": "ListAllowDomain", "Docs": "", "Typewords": ["string"] }, { "Name": "AcceptRejectsToMailbox", "Docs": "", "Typewords": ["string"] }, { "Name": "Mailbox", "Docs": "", "Typewords": ["string"] }, { "Name": "VerifiedDNSDomain", "Docs": "", "Typewords": ["Domain"] }, { "Name": "ListAllowDNSDomain", "Docs": "", "Typewords": ["Domain"] }] },
"SubjectPass": { "Name": "SubjectPass", "Docs": "", "Fields": [{ "Name": "Period", "Docs": "", "Typewords": ["int64"] }] },
@@ -404,9 +407,21 @@ var api;
"ClientConfigs": { "Name": "ClientConfigs", "Docs": "", "Fields": [{ "Name": "Entries", "Docs": "", "Typewords": ["[]", "ClientConfigsEntry"] }] },
"ClientConfigsEntry": { "Name": "ClientConfigsEntry", "Docs": "", "Fields": [{ "Name": "Protocol", "Docs": "", "Typewords": ["string"] }, { "Name": "Host", "Docs": "", "Typewords": ["Domain"] }, { "Name": "Port", "Docs": "", "Typewords": ["int32"] }, { "Name": "Listener", "Docs": "", "Typewords": ["string"] }, { "Name": "Note", "Docs": "", "Typewords": ["string"] }] },
"HoldRule": { "Name": "HoldRule", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Account", "Docs": "", "Typewords": ["string"] }, { "Name": "SenderDomain", "Docs": "", "Typewords": ["Domain"] }, { "Name": "RecipientDomain", "Docs": "", "Typewords": ["Domain"] }, { "Name": "SenderDomainStr", "Docs": "", "Typewords": ["string"] }, { "Name": "RecipientDomainStr", "Docs": "", "Typewords": ["string"] }] },
"Filter": { "Name": "Filter", "Docs": "", "Fields": [{ "Name": "IDs", "Docs": "", "Typewords": ["[]", "int64"] }, { "Name": "Account", "Docs": "", "Typewords": ["string"] }, { "Name": "From", "Docs": "", "Typewords": ["string"] }, { "Name": "To", "Docs": "", "Typewords": ["string"] }, { "Name": "Hold", "Docs": "", "Typewords": ["nullable", "bool"] }, { "Name": "Submitted", "Docs": "", "Typewords": ["string"] }, { "Name": "NextAttempt", "Docs": "", "Typewords": ["string"] }, { "Name": "Transport", "Docs": "", "Typewords": ["nullable", "string"] }] },
|
"Filter": { "Name": "Filter", "Docs": "", "Fields": [{ "Name": "Max", "Docs": "", "Typewords": ["int32"] }, { "Name": "IDs", "Docs": "", "Typewords": ["[]", "int64"] }, { "Name": "Account", "Docs": "", "Typewords": ["string"] }, { "Name": "From", "Docs": "", "Typewords": ["string"] }, { "Name": "To", "Docs": "", "Typewords": ["string"] }, { "Name": "Hold", "Docs": "", "Typewords": ["nullable", "bool"] }, { "Name": "Submitted", "Docs": "", "Typewords": ["string"] }, { "Name": "NextAttempt", "Docs": "", "Typewords": ["string"] }, { "Name": "Transport", "Docs": "", "Typewords": ["nullable", "string"] }] },
"Msg": { "Name": "Msg", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "BaseID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Queued", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "Hold", "Docs": "", "Typewords": ["bool"] }, { "Name": "SenderAccount", "Docs": "", "Typewords": ["string"] }, { "Name": "SenderLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "SenderDomain", "Docs": "", "Typewords": ["IPDomain"] }, { "Name": "SenderDomainStr", "Docs": "", "Typewords": ["string"] }, { "Name": "RecipientLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "RecipientDomain", "Docs": "", "Typewords": ["IPDomain"] }, { "Name": "RecipientDomainStr", "Docs": "", "Typewords": ["string"] }, { "Name": "Attempts", "Docs": "", "Typewords": ["int32"] }, { "Name": "MaxAttempts", "Docs": "", "Typewords": ["int32"] }, { "Name": "DialedIPs", "Docs": "", "Typewords": ["{}", "[]", "IP"] }, { "Name": "NextAttempt", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "LastAttempt", "Docs": "", "Typewords": ["nullable", "timestamp"] }, { "Name": "LastError", "Docs": "", "Typewords": ["string"] }, { "Name": "Has8bit", "Docs": "", "Typewords": ["bool"] }, { "Name": "SMTPUTF8", "Docs": "", "Typewords": ["bool"] }, { "Name": "IsDMARCReport", "Docs": "", "Typewords": ["bool"] }, { "Name": "IsTLSReport", "Docs": "", "Typewords": ["bool"] }, { "Name": "Size", "Docs": "", "Typewords": ["int64"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }, { "Name": "MsgPrefix", "Docs": "", "Typewords": ["nullable", "string"] }, { "Name": "DSNUTF8", "Docs": "", "Typewords": ["nullable", "string"] }, { "Name": "Transport", "Docs": "", "Typewords": ["string"] }, { "Name": "RequireTLS", "Docs": "", "Typewords": ["nullable", "bool"] }, { "Name": "FutureReleaseRequest", "Docs": "", "Typewords": ["string"] }] },
|
"Sort": { "Name": "Sort", "Docs": "", "Fields": [{ "Name": "Field", "Docs": "", "Typewords": ["string"] }, { "Name": "LastID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Last", "Docs": "", "Typewords": ["any"] }, { "Name": "Asc", "Docs": "", "Typewords": ["bool"] }] },
|
||||||
|
"Msg": { "Name": "Msg", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "BaseID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Queued", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "Hold", "Docs": "", "Typewords": ["bool"] }, { "Name": "SenderAccount", "Docs": "", "Typewords": ["string"] }, { "Name": "SenderLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "SenderDomain", "Docs": "", "Typewords": ["IPDomain"] }, { "Name": "SenderDomainStr", "Docs": "", "Typewords": ["string"] }, { "Name": "FromID", "Docs": "", "Typewords": ["string"] }, { "Name": "RecipientLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "RecipientDomain", "Docs": "", "Typewords": ["IPDomain"] }, { "Name": "RecipientDomainStr", "Docs": "", "Typewords": ["string"] }, { "Name": "Attempts", "Docs": "", "Typewords": ["int32"] }, { "Name": "MaxAttempts", "Docs": "", "Typewords": ["int32"] }, { "Name": "DialedIPs", "Docs": "", "Typewords": ["{}", "[]", "IP"] }, { "Name": "NextAttempt", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "LastAttempt", "Docs": "", "Typewords": ["nullable", "timestamp"] }, { "Name": "Results", "Docs": "", "Typewords": ["[]", "MsgResult"] }, { "Name": "Has8bit", "Docs": "", "Typewords": ["bool"] }, { "Name": "SMTPUTF8", "Docs": "", "Typewords": ["bool"] }, { "Name": "IsDMARCReport", "Docs": "", "Typewords": ["bool"] }, { "Name": "IsTLSReport", "Docs": "", "Typewords": ["bool"] }, { "Name": "Size", "Docs": "", "Typewords": ["int64"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }, { "Name": "MsgPrefix", "Docs": "", "Typewords": ["nullable", "string"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "DSNUTF8", "Docs": "", "Typewords": ["nullable", "string"] }, { "Name": "Transport", "Docs": "", "Typewords": ["string"] }, { "Name": "RequireTLS", "Docs": "", "Typewords": ["nullable", "bool"] }, { "Name": "FutureReleaseRequest", "Docs": "", "Typewords": ["string"] }, { "Name": "Extra", "Docs": "", "Typewords": ["{}", "string"] }] },
|
||||||
"IPDomain": { "Name": "IPDomain", "Docs": "", "Fields": [{ "Name": "IP", "Docs": "", "Typewords": ["IP"] }, { "Name": "Domain", "Docs": "", "Typewords": ["Domain"] }] },
|
"IPDomain": { "Name": "IPDomain", "Docs": "", "Fields": [{ "Name": "IP", "Docs": "", "Typewords": ["IP"] }, { "Name": "Domain", "Docs": "", "Typewords": ["Domain"] }] },
|
||||||
|
"MsgResult": { "Name": "MsgResult", "Docs": "", "Fields": [{ "Name": "Start", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "Duration", "Docs": "", "Typewords": ["int64"] }, { "Name": "Success", "Docs": "", "Typewords": ["bool"] }, { "Name": "Code", "Docs": "", "Typewords": ["int32"] }, { "Name": "Secode", "Docs": "", "Typewords": ["string"] }, { "Name": "Error", "Docs": "", "Typewords": ["string"] }] },
|
||||||
|
"RetiredFilter": { "Name": "RetiredFilter", "Docs": "", "Fields": [{ "Name": "Max", "Docs": "", "Typewords": ["int32"] }, { "Name": "IDs", "Docs": "", "Typewords": ["[]", "int64"] }, { "Name": "Account", "Docs": "", "Typewords": ["string"] }, { "Name": "From", "Docs": "", "Typewords": ["string"] }, { "Name": "To", "Docs": "", "Typewords": ["string"] }, { "Name": "Submitted", "Docs": "", "Typewords": ["string"] }, { "Name": "LastActivity", "Docs": "", "Typewords": ["string"] }, { "Name": "Transport", "Docs": "", "Typewords": ["nullable", "string"] }, { "Name": "Success", "Docs": "", "Typewords": ["nullable", "bool"] }] },
|
||||||
|
"RetiredSort": { "Name": "RetiredSort", "Docs": "", "Fields": [{ "Name": "Field", "Docs": "", "Typewords": ["string"] }, { "Name": "LastID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Last", "Docs": "", "Typewords": ["any"] }, { "Name": "Asc", "Docs": "", "Typewords": ["bool"] }] },
|
||||||
|
"MsgRetired": { "Name": "MsgRetired", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "BaseID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Queued", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "SenderAccount", "Docs": "", "Typewords": ["string"] }, { "Name": "SenderLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "SenderDomainStr", "Docs": "", "Typewords": ["string"] }, { "Name": "FromID", "Docs": "", "Typewords": ["string"] }, { "Name": "RecipientLocalpart", "Docs": "", "Typewords": ["Localpart"] }, { "Name": "RecipientDomain", "Docs": "", "Typewords": ["IPDomain"] }, { "Name": "RecipientDomainStr", "Docs": "", "Typewords": ["string"] }, { "Name": "Attempts", "Docs": "", "Typewords": ["int32"] }, { "Name": "MaxAttempts", "Docs": "", "Typewords": ["int32"] }, { "Name": "DialedIPs", "Docs": "", "Typewords": ["{}", "[]", "IP"] }, { "Name": "LastAttempt", "Docs": "", "Typewords": ["nullable", "timestamp"] }, { "Name": "Results", "Docs": "", "Typewords": ["[]", "MsgResult"] }, { "Name": "Has8bit", "Docs": "", "Typewords": ["bool"] }, { "Name": "SMTPUTF8", "Docs": "", "Typewords": ["bool"] }, { "Name": "IsDMARCReport", "Docs": "", "Typewords": ["bool"] }, { "Name": "IsTLSReport", "Docs": "", "Typewords": ["bool"] }, { "Name": "Size", "Docs": "", "Typewords": ["int64"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "Transport", "Docs": "", "Typewords": ["string"] }, { "Name": "RequireTLS", "Docs": "", "Typewords": ["nullable", "bool"] }, { "Name": "FutureReleaseRequest", "Docs": "", "Typewords": ["string"] }, { "Name": "Extra", "Docs": "", "Typewords": ["{}", "string"] }, { "Name": "LastActivity", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "RecipientAddress", "Docs": "", "Typewords": ["string"] }, { "Name": "Success", "Docs": "", "Typewords": ["bool"] }, { "Name": "KeepUntil", "Docs": "", "Typewords": ["timestamp"] }] },
|
||||||
|
"HookFilter": { "Name": "HookFilter", "Docs": "", "Fields": [{ "Name": "Max", "Docs": "", "Typewords": ["int32"] }, { "Name": "IDs", "Docs": "", "Typewords": ["[]", "int64"] }, { "Name": "Account", "Docs": "", "Typewords": ["string"] }, { "Name": "Submitted", "Docs": "", "Typewords": ["string"] }, { "Name": "NextAttempt", "Docs": "", "Typewords": ["string"] }, { "Name": "Event", "Docs": "", "Typewords": ["string"] }] },
|
||||||
|
"HookSort": { "Name": "HookSort", "Docs": "", "Fields": [{ "Name": "Field", "Docs": "", "Typewords": ["string"] }, { "Name": "LastID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Last", "Docs": "", "Typewords": ["any"] }, { "Name": "Asc", "Docs": "", "Typewords": ["bool"] }] },
|
||||||
|
"Hook": { "Name": "Hook", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "QueueMsgID", "Docs": "", "Typewords": ["int64"] }, { "Name": "FromID", "Docs": "", "Typewords": ["string"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "Extra", "Docs": "", "Typewords": ["{}", "string"] }, { "Name": "Account", "Docs": "", "Typewords": ["string"] }, { "Name": "URL", "Docs": "", "Typewords": ["string"] }, { "Name": "Authorization", "Docs": "", "Typewords": ["string"] }, { "Name": "IsIncoming", "Docs": "", "Typewords": ["bool"] }, { "Name": "OutgoingEvent", "Docs": "", "Typewords": ["string"] }, { "Name": "Payload", "Docs": "", "Typewords": ["string"] }, { "Name": "Submitted", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "Attempts", "Docs": "", "Typewords": ["int32"] }, { "Name": "NextAttempt", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "Results", "Docs": "", "Typewords": ["[]", "HookResult"] }] },
|
||||||
|
"HookResult": { "Name": "HookResult", "Docs": "", "Fields": [{ "Name": "Start", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "Duration", "Docs": "", "Typewords": ["int64"] }, { "Name": "URL", "Docs": "", "Typewords": ["string"] }, { "Name": "Success", "Docs": "", "Typewords": ["bool"] }, { "Name": "Code", "Docs": "", "Typewords": ["int32"] }, { "Name": "Error", "Docs": "", "Typewords": ["string"] }, { "Name": "Response", "Docs": "", "Typewords": ["string"] }] },
|
||||||
|
"HookRetiredFilter": { "Name": "HookRetiredFilter", "Docs": "", "Fields": [{ "Name": "Max", "Docs": "", "Typewords": ["int32"] }, { "Name": "IDs", "Docs": "", "Typewords": ["[]", "int64"] }, { "Name": "Account", "Docs": "", "Typewords": ["string"] }, { "Name": "Submitted", "Docs": "", "Typewords": ["string"] }, { "Name": "LastActivity", "Docs": "", "Typewords": ["string"] }, { "Name": "Event", "Docs": "", "Typewords": ["string"] }] },
|
||||||
|
"HookRetiredSort": { "Name": "HookRetiredSort", "Docs": "", "Fields": [{ "Name": "Field", "Docs": "", "Typewords": ["string"] }, { "Name": "LastID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Last", "Docs": "", "Typewords": ["any"] }, { "Name": "Asc", "Docs": "", "Typewords": ["bool"] }] },
|
||||||
|
"HookRetired": { "Name": "HookRetired", "Docs": "", "Fields": [{ "Name": "ID", "Docs": "", "Typewords": ["int64"] }, { "Name": "QueueMsgID", "Docs": "", "Typewords": ["int64"] }, { "Name": "FromID", "Docs": "", "Typewords": ["string"] }, { "Name": "MessageID", "Docs": "", "Typewords": ["string"] }, { "Name": "Subject", "Docs": "", "Typewords": ["string"] }, { "Name": "Extra", "Docs": "", "Typewords": ["{}", "string"] }, { "Name": "Account", "Docs": "", "Typewords": ["string"] }, { "Name": "URL", "Docs": "", "Typewords": ["string"] }, { "Name": "Authorization", "Docs": "", "Typewords": ["bool"] }, { "Name": "IsIncoming", "Docs": "", "Typewords": ["bool"] }, { "Name": "OutgoingEvent", "Docs": "", "Typewords": ["string"] }, { "Name": "Payload", "Docs": "", "Typewords": ["string"] }, { "Name": "Submitted", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "SupersededByID", "Docs": "", "Typewords": ["int64"] }, { "Name": "Attempts", "Docs": "", "Typewords": ["int32"] }, { "Name": "Results", "Docs": "", "Typewords": ["[]", "HookResult"] }, { "Name": "Success", "Docs": "", "Typewords": ["bool"] }, { "Name": "LastActivity", "Docs": "", "Typewords": ["timestamp"] }, { "Name": "KeepUntil", "Docs": "", "Typewords": ["timestamp"] }] },
|
||||||
"WebserverConfig": { "Name": "WebserverConfig", "Docs": "", "Fields": [{ "Name": "WebDNSDomainRedirects", "Docs": "", "Typewords": ["[]", "[]", "Domain"] }, { "Name": "WebDomainRedirects", "Docs": "", "Typewords": ["[]", "[]", "string"] }, { "Name": "WebHandlers", "Docs": "", "Typewords": ["[]", "WebHandler"] }] },
|
"WebserverConfig": { "Name": "WebserverConfig", "Docs": "", "Fields": [{ "Name": "WebDNSDomainRedirects", "Docs": "", "Typewords": ["[]", "[]", "Domain"] }, { "Name": "WebDomainRedirects", "Docs": "", "Typewords": ["[]", "[]", "string"] }, { "Name": "WebHandlers", "Docs": "", "Typewords": ["[]", "WebHandler"] }] },
|
||||||
"WebHandler": { "Name": "WebHandler", "Docs": "", "Fields": [{ "Name": "LogName", "Docs": "", "Typewords": ["string"] }, { "Name": "Domain", "Docs": "", "Typewords": ["string"] }, { "Name": "PathRegexp", "Docs": "", "Typewords": ["string"] }, { "Name": "DontRedirectPlainHTTP", "Docs": "", "Typewords": ["bool"] }, { "Name": "Compress", "Docs": "", "Typewords": ["bool"] }, { "Name": "WebStatic", "Docs": "", "Typewords": ["nullable", "WebStatic"] }, { "Name": "WebRedirect", "Docs": "", "Typewords": ["nullable", "WebRedirect"] }, { "Name": "WebForward", "Docs": "", "Typewords": ["nullable", "WebForward"] }, { "Name": "Name", "Docs": "", "Typewords": ["string"] }, { "Name": "DNSDomain", "Docs": "", "Typewords": ["Domain"] }] },
|
"WebHandler": { "Name": "WebHandler", "Docs": "", "Fields": [{ "Name": "LogName", "Docs": "", "Typewords": ["string"] }, { "Name": "Domain", "Docs": "", "Typewords": ["string"] }, { "Name": "PathRegexp", "Docs": "", "Typewords": ["string"] }, { "Name": "DontRedirectPlainHTTP", "Docs": "", "Typewords": ["bool"] }, { "Name": "Compress", "Docs": "", "Typewords": ["bool"] }, { "Name": "WebStatic", "Docs": "", "Typewords": ["nullable", "WebStatic"] }, { "Name": "WebRedirect", "Docs": "", "Typewords": ["nullable", "WebRedirect"] }, { "Name": "WebForward", "Docs": "", "Typewords": ["nullable", "WebForward"] }, { "Name": "Name", "Docs": "", "Typewords": ["string"] }, { "Name": "DNSDomain", "Docs": "", "Typewords": ["Domain"] }] },
|
||||||
"WebStatic": { "Name": "WebStatic", "Docs": "", "Fields": [{ "Name": "StripPrefix", "Docs": "", "Typewords": ["string"] }, { "Name": "Root", "Docs": "", "Typewords": ["string"] }, { "Name": "ListFiles", "Docs": "", "Typewords": ["bool"] }, { "Name": "ContinueNotFound", "Docs": "", "Typewords": ["bool"] }, { "Name": "ResponseHeaders", "Docs": "", "Typewords": ["{}", "string"] }] },
|
"WebStatic": { "Name": "WebStatic", "Docs": "", "Fields": [{ "Name": "StripPrefix", "Docs": "", "Typewords": ["string"] }, { "Name": "Root", "Docs": "", "Typewords": ["string"] }, { "Name": "ListFiles", "Docs": "", "Typewords": ["bool"] }, { "Name": "ContinueNotFound", "Docs": "", "Typewords": ["bool"] }, { "Name": "ResponseHeaders", "Docs": "", "Typewords": ["{}", "string"] }] },
|
||||||
|
@@ -472,6 +487,8 @@ var api;
|
||||||
AutodiscoverCheckResult: (v) => api.parse("AutodiscoverCheckResult", v),
|
AutodiscoverCheckResult: (v) => api.parse("AutodiscoverCheckResult", v),
|
||||||
AutodiscoverSRV: (v) => api.parse("AutodiscoverSRV", v),
|
AutodiscoverSRV: (v) => api.parse("AutodiscoverSRV", v),
|
||||||
Account: (v) => api.parse("Account", v),
|
Account: (v) => api.parse("Account", v),
|
||||||
|
OutgoingWebhook: (v) => api.parse("OutgoingWebhook", v),
|
||||||
|
IncomingWebhook: (v) => api.parse("IncomingWebhook", v),
|
||||||
Destination: (v) => api.parse("Destination", v),
|
Destination: (v) => api.parse("Destination", v),
|
||||||
Ruleset: (v) => api.parse("Ruleset", v),
|
Ruleset: (v) => api.parse("Ruleset", v),
|
||||||
SubjectPass: (v) => api.parse("SubjectPass", v),
|
SubjectPass: (v) => api.parse("SubjectPass", v),
|
||||||
|
@@ -505,8 +522,20 @@ var api;
|
||||||
ClientConfigsEntry: (v) => api.parse("ClientConfigsEntry", v),
|
ClientConfigsEntry: (v) => api.parse("ClientConfigsEntry", v),
|
||||||
HoldRule: (v) => api.parse("HoldRule", v),
|
HoldRule: (v) => api.parse("HoldRule", v),
|
||||||
Filter: (v) => api.parse("Filter", v),
|
Filter: (v) => api.parse("Filter", v),
|
||||||
|
Sort: (v) => api.parse("Sort", v),
|
||||||
Msg: (v) => api.parse("Msg", v),
|
Msg: (v) => api.parse("Msg", v),
|
||||||
IPDomain: (v) => api.parse("IPDomain", v),
|
IPDomain: (v) => api.parse("IPDomain", v),
|
||||||
|
MsgResult: (v) => api.parse("MsgResult", v),
|
||||||
|
RetiredFilter: (v) => api.parse("RetiredFilter", v),
|
||||||
|
RetiredSort: (v) => api.parse("RetiredSort", v),
|
||||||
|
MsgRetired: (v) => api.parse("MsgRetired", v),
|
||||||
|
HookFilter: (v) => api.parse("HookFilter", v),
|
||||||
|
HookSort: (v) => api.parse("HookSort", v),
|
||||||
|
Hook: (v) => api.parse("Hook", v),
|
||||||
|
HookResult: (v) => api.parse("HookResult", v),
|
||||||
|
HookRetiredFilter: (v) => api.parse("HookRetiredFilter", v),
|
||||||
|
HookRetiredSort: (v) => api.parse("HookRetiredSort", v),
|
||||||
|
HookRetired: (v) => api.parse("HookRetired", v),
|
||||||
WebserverConfig: (v) => api.parse("WebserverConfig", v),
|
WebserverConfig: (v) => api.parse("WebserverConfig", v),
|
||||||
WebHandler: (v) => api.parse("WebHandler", v),
|
WebHandler: (v) => api.parse("WebHandler", v),
|
||||||
WebStatic: (v) => api.parse("WebStatic", v),
|
WebStatic: (v) => api.parse("WebStatic", v),
|
||||||
|
@@ -868,11 +897,11 @@ var api;
|
||||||
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
||||||
}
|
}
|
||||||
// QueueList returns the messages currently in the outgoing queue.
|
// QueueList returns the messages currently in the outgoing queue.
|
||||||
async QueueList(filter) {
|
async QueueList(filter, sort) {
|
||||||
const fn = "QueueList";
|
const fn = "QueueList";
|
||||||
const paramTypes = [["Filter"]];
|
const paramTypes = [["Filter"], ["Sort"]];
|
||||||
const returnTypes = [["[]", "Msg"]];
|
const returnTypes = [["[]", "Msg"]];
|
||||||
const params = [filter];
|
const params = [filter, sort];
|
||||||
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
||||||
}
|
}
|
||||||
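A minimal sketch of calling the updated QueueList signature from the generated client, assuming the Filter and Sort shapes defined above; the account name is a placeholder:

    // List up to 100 queued messages for one account, soonest next attempt first.
    const filter = { Max: 100, IDs: [], Account: 'example', From: '', To: '', Hold: null, Submitted: '', NextAttempt: '', Transport: null };
    const sort = { Field: 'NextAttempt', LastID: 0, Last: null, Asc: true };
    const msgs = await client.QueueList(filter, sort) || [];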
// QueueNextAttemptSet sets a new time for next delivery attempt of matching
|
// QueueNextAttemptSet sets a new time for next delivery attempt of matching
|
||||||
|
@@ -935,6 +964,65 @@ var api;
|
||||||
const params = [filter, transport];
|
const params = [filter, transport];
|
||||||
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
||||||
}
|
}
|
||||||
|
// RetiredList returns messages retired from the queue (delivery could
|
||||||
|
// have succeeded or failed).
|
||||||
|
async RetiredList(filter, sort) {
|
||||||
|
const fn = "RetiredList";
|
||||||
|
const paramTypes = [["RetiredFilter"], ["RetiredSort"]];
|
||||||
|
const returnTypes = [["[]", "MsgRetired"]];
|
||||||
|
const params = [filter, sort];
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
||||||
|
}
|
||||||
|
// HookQueueSize returns the number of webhooks still to be delivered.
|
||||||
|
async HookQueueSize() {
|
||||||
|
const fn = "HookQueueSize";
|
||||||
|
const paramTypes = [];
|
||||||
|
const returnTypes = [["int32"]];
|
||||||
|
const params = [];
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
||||||
|
}
|
||||||
|
// HookList lists webhooks still to be delivered.
|
||||||
|
async HookList(filter, sort) {
|
||||||
|
const fn = "HookList";
|
||||||
|
const paramTypes = [["HookFilter"], ["HookSort"]];
|
||||||
|
const returnTypes = [["[]", "Hook"]];
|
||||||
|
const params = [filter, sort];
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
||||||
|
}
|
||||||
|
// HookNextAttemptSet sets a new time for next delivery attempt of matching
|
||||||
|
// hooks from the queue.
|
||||||
|
async HookNextAttemptSet(filter, minutes) {
|
||||||
|
const fn = "HookNextAttemptSet";
|
||||||
|
const paramTypes = [["HookFilter"], ["int32"]];
|
||||||
|
const returnTypes = [["int32"]];
|
||||||
|
const params = [filter, minutes];
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
||||||
|
}
|
||||||
|
// HookNextAttemptAdd adds a duration to the time of next delivery attempt of
|
||||||
|
// matching hooks from the queue.
|
||||||
|
async HookNextAttemptAdd(filter, minutes) {
|
||||||
|
const fn = "HookNextAttemptAdd";
|
||||||
|
const paramTypes = [["HookFilter"], ["int32"]];
|
||||||
|
const returnTypes = [["int32"]];
|
||||||
|
const params = [filter, minutes];
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
||||||
|
}
|
||||||
|
// HookRetiredList lists retired webhooks.
|
||||||
|
async HookRetiredList(filter, sort) {
|
||||||
|
const fn = "HookRetiredList";
|
||||||
|
const paramTypes = [["HookRetiredFilter"], ["HookRetiredSort"]];
|
||||||
|
const returnTypes = [["[]", "HookRetired"]];
|
||||||
|
const params = [filter, sort];
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
||||||
|
}
|
||||||
|
// HookCancel prevents further delivery attempts of matching webhooks.
|
||||||
|
async HookCancel(filter) {
|
||||||
|
const fn = "HookCancel";
|
||||||
|
const paramTypes = [["HookFilter"]];
|
||||||
|
const returnTypes = [["int32"]];
|
||||||
|
const params = [filter];
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params);
|
||||||
|
}
|
||||||
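A rough sketch combining the new webhook-queue calls, using the field names from the HookFilter and HookSort types above; the account and event names are placeholders:

    // List pending outgoing-delivery webhooks for one account, then push their
    // next attempt 30 minutes into the future.
    const hfilter = { Max: 50, IDs: [], Account: 'example', Submitted: '', NextAttempt: '', Event: 'delivered' };
    const hsort = { Field: 'NextAttempt', LastID: 0, Last: null, Asc: true };
    const hooks = await client.HookList(hfilter, hsort) || [];
    if (hooks.length > 0) {
        const n = await client.HookNextAttemptAdd({ ...hfilter, IDs: hooks.map(h => h.ID) }, 30);
        console.log(n + ' webhook(s) rescheduled');
    }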
// LogLevels returns the current log levels.
|
// LogLevels returns the current log levels.
|
||||||
async LogLevels() {
|
async LogLevels() {
|
||||||
const fn = "LogLevels";
|
const fn = "LogLevels";
|
||||||
|
@@ -1516,6 +1604,37 @@ const login = async (reason) => {
|
||||||
password.focus();
|
password.focus();
|
||||||
});
|
});
|
||||||
};
|
};
|
||||||
|
// Popup shows kids in a centered div with white background on top of a
|
||||||
|
// transparent overlay on top of the window. Clicking the overlay or hitting
|
||||||
|
// Escape closes the popup. Scrollbars are automatically added to the div with
|
||||||
|
// kids. Returns a function that removes the popup.
|
||||||
|
const popup = (...kids) => {
|
||||||
|
const origFocus = document.activeElement;
|
||||||
|
const close = () => {
|
||||||
|
if (!root.parentNode) {
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
root.remove();
|
||||||
|
if (origFocus && origFocus instanceof HTMLElement && origFocus.parentNode) {
|
||||||
|
origFocus.focus();
|
||||||
|
}
|
||||||
|
};
|
||||||
|
let content;
|
||||||
|
const root = dom.div(style({ position: 'fixed', top: 0, right: 0, bottom: 0, left: 0, backgroundColor: 'rgba(0, 0, 0, 0.1)', display: 'flex', alignItems: 'center', justifyContent: 'center', zIndex: '1' }), function keydown(e) {
|
||||||
|
if (e.key === 'Escape') {
|
||||||
|
e.stopPropagation();
|
||||||
|
close();
|
||||||
|
}
|
||||||
|
}, function click(e) {
|
||||||
|
e.stopPropagation();
|
||||||
|
close();
|
||||||
|
}, content = dom.div(attr.tabindex('0'), style({ backgroundColor: 'white', borderRadius: '.25em', padding: '1em', boxShadow: '0 0 20px rgba(0, 0, 0, 0.1)', border: '1px solid #ddd', maxWidth: '95vw', overflowX: 'auto', maxHeight: '95vh', overflowY: 'auto' }), function click(e) {
|
||||||
|
e.stopPropagation();
|
||||||
|
}, kids));
|
||||||
|
document.body.appendChild(root);
|
||||||
|
content.focus();
|
||||||
|
return close;
|
||||||
|
};
|
||||||
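Purely as an illustration of the popup helper introduced here: the returned function closes the popup, and Escape or a click on the overlay does the same.

    // Hypothetical: a small informational popup dismissed by its own button.
    const close = popup(dom.h1('Done'), dom.p('The queue was updated.'), dom.clickbutton('OK', function click() {
        close();
    }));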
const localStorageGet = (k) => {
|
const localStorageGet = (k) => {
|
||||||
try {
|
try {
|
||||||
return window.localStorage.getItem(k);
|
return window.localStorage.getItem(k);
|
||||||
|
@@ -1709,9 +1828,10 @@ const formatSize = (n) => {
|
||||||
return n + ' bytes';
|
return n + ' bytes';
|
||||||
};
|
};
|
||||||
const index = async () => {
|
const index = async () => {
|
||||||
const [domains, queueSize, checkUpdatesEnabled, accounts] = await Promise.all([
|
const [domains, queueSize, hooksQueueSize, checkUpdatesEnabled, accounts] = await Promise.all([
|
||||||
client.Domains(),
|
client.Domains(),
|
||||||
client.QueueSize(),
|
client.QueueSize(),
|
||||||
|
client.HookQueueSize(),
|
||||||
client.CheckUpdatesEnabled(),
|
client.CheckUpdatesEnabled(),
|
||||||
client.Accounts(),
|
client.Accounts(),
|
||||||
]);
|
]);
|
||||||
|
@@ -1722,7 +1842,7 @@ const index = async () => {
|
||||||
let recvIDFieldset;
|
let recvIDFieldset;
|
||||||
let recvID;
|
let recvID;
|
||||||
let cidElem;
|
let cidElem;
|
||||||
dom._kids(page, crumbs('Mox Admin'), checkUpdatesEnabled ? [] : dom.p(box(yellow, 'Warning: Checking for updates has not been enabled in mox.conf (CheckUpdates: true).', dom.br(), 'Make sure you stay up to date through another mechanism!', dom.br(), 'You have a responsibility to keep the internet-connected software you run up to date and secure!', dom.br(), 'See ', link('https://updates.xmox.nl/changelog'))), dom.p(dom.a('Accounts', attr.href('#accounts')), dom.br(), dom.a('Queue', attr.href('#queue')), ' (' + queueSize + ')', dom.br()), dom.h2('Domains'), (domains || []).length === 0 ? box(red, 'No domains') :
|
dom._kids(page, crumbs('Mox Admin'), checkUpdatesEnabled ? [] : dom.p(box(yellow, 'Warning: Checking for updates has not been enabled in mox.conf (CheckUpdates: true).', dom.br(), 'Make sure you stay up to date through another mechanism!', dom.br(), 'You have a responsibility to keep the internet-connected software you run up to date and secure!', dom.br(), 'See ', link('https://updates.xmox.nl/changelog'))), dom.p(dom.a('Accounts', attr.href('#accounts')), dom.br(), dom.a('Queue', attr.href('#queue')), ' (' + queueSize + ')', dom.br(), dom.a('Webhook queue', attr.href('#webhookqueue')), ' (' + hooksQueueSize + ')', dom.br()), dom.h2('Domains'), (domains || []).length === 0 ? box(red, 'No domains') :
|
||||||
dom.ul((domains || []).map(d => dom.li(dom.a(attr.href('#domains/' + domainName(d)), domainString(d))))), dom.br(), dom.h2('Add domain'), dom.form(async function submit(e) {
|
dom.ul((domains || []).map(d => dom.li(dom.a(attr.href('#domains/' + domainName(d)), domainString(d))))), dom.br(), dom.h2('Add domain'), dom.form(async function submit(e) {
|
||||||
e.preventDefault();
|
e.preventDefault();
|
||||||
e.stopPropagation();
|
e.stopPropagation();
|
||||||
|
@@ -1830,6 +1950,7 @@ const account = async (name) => {
|
||||||
client.Account(name),
|
client.Account(name),
|
||||||
client.Domains(),
|
client.Domains(),
|
||||||
]);
|
]);
|
||||||
|
// todo: show suppression list, and buttons to add/remove entries.
|
||||||
let form;
|
let form;
|
||||||
let fieldset;
|
let fieldset;
|
||||||
let localpart;
|
let localpart;
|
||||||
|
@@ -2096,7 +2217,7 @@ const dmarcEvaluations = async () => {
|
||||||
let until;
|
let until;
|
||||||
let comment;
|
let comment;
|
||||||
const nextmonth = new Date(new Date().getTime() + 31 * 24 * 3600 * 1000);
|
const nextmonth = new Date(new Date().getTime() + 31 * 24 * 3600 * 1000);
|
||||||
dom._kids(page, crumbs(crumblink('Mox Admin', '#'), crumblink('DMARC', '#dmarc'), 'Evaluations'), dom.p('Incoming messages are checked against the DMARC policy of the domain in the message From header. If the policy requests reporting on the resulting evaluations, they are stored in the database. Each interval of 1 to 24 hours, the evaluations may be sent to a reporting address specified in the domain\'s DMARC policy. Not all evaluations are a reason to send a report, but if a report is sent all evaluations are included.'), dom.table(dom._class('hover'), dom.thead(dom.tr(dom.th('Domain', attr.title('Domain in the message From header. Keep in mind these can be forged, so this does not necessarily mean someone from this domain authentically tried delivering email.')), dom.th('Dispositions', attr.title('Unique dispositions occurring in report.')), dom.th('Evaluations', attr.title('Total number of message delivery attempts, including retries.')), dom.th('Send report', attr.title('Whether the current evaluations will cause a report to be sent.')))), dom.tbody(Object.entries(evalStats).sort((a, b) => a[0] < b[0] ? -1 : 1).map(t => dom.tr(dom.td(dom.a(attr.href('#dmarc/evaluations/' + domainName(t[1].Domain)), domainString(t[1].Domain))), dom.td((t[1].Dispositions || []).join(' ')), dom.td(style({ textAlign: 'right' }), '' + t[1].Count), dom.td(style({ textAlign: 'right' }), t[1].SendReport ? '✓' : ''))), isEmpty(evalStats) ? dom.tr(dom.td(attr.colspan('3'), 'No evaluations.')) : [])), dom.br(), dom.br(), dom.h2('Suppressed reporting addresses'), dom.p('In practice, sending a DMARC report to a reporting address can cause DSN to be sent back. Such addresses can be added to a supression list for a period, to reduce noise in the postmaster mailbox.'), dom.form(async function submit(e) {
|
dom._kids(page, crumbs(crumblink('Mox Admin', '#'), crumblink('DMARC', '#dmarc'), 'Evaluations'), dom.p('Incoming messages are checked against the DMARC policy of the domain in the message From header. If the policy requests reporting on the resulting evaluations, they are stored in the database. Each interval of 1 to 24 hours, the evaluations may be sent to a reporting address specified in the domain\'s DMARC policy. Not all evaluations are a reason to send a report, but if a report is sent all evaluations are included.'), dom.table(dom._class('hover'), dom.thead(dom.tr(dom.th('Domain', attr.title('Domain in the message From header. Keep in mind these can be forged, so this does not necessarily mean someone from this domain authentically tried delivering email.')), dom.th('Dispositions', attr.title('Unique dispositions occurring in report.')), dom.th('Evaluations', attr.title('Total number of message delivery attempts, including retries.')), dom.th('Send report', attr.title('Whether the current evaluations will cause a report to be sent.')))), dom.tbody(Object.entries(evalStats).sort((a, b) => a[0] < b[0] ? -1 : 1).map(t => dom.tr(dom.td(dom.a(attr.href('#dmarc/evaluations/' + domainName(t[1].Domain)), domainString(t[1].Domain))), dom.td((t[1].Dispositions || []).join(' ')), dom.td(style({ textAlign: 'right' }), '' + t[1].Count), dom.td(style({ textAlign: 'right' }), t[1].SendReport ? '✓' : ''))), isEmpty(evalStats) ? dom.tr(dom.td(attr.colspan('3'), 'No evaluations.')) : [])), dom.br(), dom.br(), dom.h2('Suppressed reporting addresses'), dom.p('In practice, sending a DMARC report to a reporting address can cause DSN to be sent back. Such addresses can be added to a suppression list for a period, to reduce noise in the postmaster mailbox.'), dom.form(async function submit(e) {
|
||||||
e.stopPropagation();
|
e.stopPropagation();
|
||||||
e.preventDefault();
|
e.preventDefault();
|
||||||
await check(fieldset, client.DMARCSuppressAdd(reportingAddress.value, new Date(until.value), comment.value));
|
await check(fieldset, client.DMARCSuppressAdd(reportingAddress.value, new Date(until.value), comment.value));
|
||||||
|
@@ -2538,21 +2659,26 @@ const dnsbl = async () => {
|
||||||
}, fieldset = dom.fieldset(dom.div('One per line'), dom.div(style({ marginBottom: '.5ex' }), monitorTextarea = dom.textarea(style({ width: '20rem' }), attr.rows('' + Math.max(5, 1 + (monitorZones || []).length)), new String((monitorZones || []).map(zone => domainName(zone)).join('\n'))), dom.div('Examples: sbl.spamhaus.org or bl.spamcop.net')), dom.div(dom.submitbutton('Save')))));
|
}, fieldset = dom.fieldset(dom.div('One per line'), dom.div(style({ marginBottom: '.5ex' }), monitorTextarea = dom.textarea(style({ width: '20rem' }), attr.rows('' + Math.max(5, 1 + (monitorZones || []).length)), new String((monitorZones || []).map(zone => domainName(zone)).join('\n'))), dom.div('Examples: sbl.spamhaus.org or bl.spamcop.net')), dom.div(dom.submitbutton('Save')))));
|
||||||
};
|
};
|
||||||
const queueList = async () => {
|
const queueList = async () => {
|
||||||
let [holdRules, msgs, transports] = await Promise.all([
|
let filter = { Max: parseInt(localStorageGet('adminpaginationsize') || '') || 100, IDs: [], Account: '', From: '', To: '', Hold: null, Submitted: '', NextAttempt: '', Transport: null };
|
||||||
|
let sort = { Field: "NextAttempt", LastID: 0, Last: null, Asc: true };
|
||||||
|
let [holdRules, msgs0, transports] = await Promise.all([
|
||||||
client.QueueHoldRuleList(),
|
client.QueueHoldRuleList(),
|
||||||
client.QueueList({ IDs: [], Account: '', From: '', To: '', Hold: null, Submitted: '', NextAttempt: '', Transport: null }),
|
client.QueueList(filter, sort),
|
||||||
client.Transports(),
|
client.Transports(),
|
||||||
]);
|
]);
|
||||||
// todo: sorting by address/timestamps/attempts.
|
let msgs = msgs0 || [];
|
||||||
|
// todo: more sorting
|
||||||
// todo: after making changes, don't reload entire page. probably best to fetch messages by id and rerender. also report on which messages weren't affected (e.g. no longer in queue).
|
// todo: after making changes, don't reload entire page. probably best to fetch messages by id and rerender. also report on which messages weren't affected (e.g. no longer in queue).
|
||||||
// todo: display which transport will be used for a message according to routing rules (in case none is explicitly configured).
|
// todo: display which transport will be used for a message according to routing rules (in case none is explicitly configured).
|
||||||
// todo: live updates with SSE connections
|
// todo: live updates with SSE connections
|
||||||
// todo: keep updating times/age.
|
// todo: keep updating times/age.
|
||||||
|
// todo: reuse this code in webaccount to show users their own message queue, and give (more limited) options to fail/reschedule deliveries.
|
||||||
const nowSecs = new Date().getTime() / 1000;
|
const nowSecs = new Date().getTime() / 1000;
|
||||||
let holdRuleAccount;
|
let holdRuleAccount;
|
||||||
let holdRuleSenderDomain;
|
let holdRuleSenderDomain;
|
||||||
let holdRuleRecipientDomain;
|
let holdRuleRecipientDomain;
|
||||||
let holdRuleSubmit;
|
let holdRuleSubmit;
|
||||||
|
let sortElem;
|
||||||
let filterForm;
|
let filterForm;
|
||||||
let filterAccount;
|
let filterAccount;
|
||||||
let filterFrom;
|
let filterFrom;
|
||||||
|
@@ -2571,6 +2697,7 @@ const queueList = async () => {
|
||||||
// syntax when calling this as parameter in api client calls below.
|
// syntax when calling this as parameter in api client calls below.
|
||||||
const gatherIDs = () => {
|
const gatherIDs = () => {
|
||||||
const f = {
|
const f = {
|
||||||
|
Max: 0,
|
||||||
IDs: Array.from(toggles.entries()).filter(t => t[1].checked).map(t => t[0]),
|
IDs: Array.from(toggles.entries()).filter(t => t[1].checked).map(t => t[0]),
|
||||||
Account: '',
|
Account: '',
|
||||||
From: '',
|
From: '',
|
||||||
|
@@ -2586,17 +2713,25 @@ const queueList = async () => {
|
||||||
}
|
}
|
||||||
return f;
|
return f;
|
||||||
};
|
};
|
||||||
const tbody = dom.tbody();
|
const popupDetails = (m) => {
|
||||||
|
const nowSecs = new Date().getTime() / 1000;
|
||||||
|
popup(dom.h1('Details'), dom.table(dom.tr(dom.td('Message subject'), dom.td(m.Subject))), dom.br(), dom.h2('Results'), dom.table(dom.thead(dom.tr(dom.th('Start'), dom.th('Duration'), dom.th('Success'), dom.th('Code'), dom.th('Secode'), dom.th('Error'))), dom.tbody((m.Results || []).length === 0 ? dom.tr(dom.td(attr.colspan('6'), 'No results.')) : [], (m.Results || []).map(r => dom.tr(dom.td(age(r.Start, false, nowSecs)), dom.td(Math.round(r.Duration / 1000000) + 'ms'), dom.td(r.Success ? '✓' : ''), dom.td('' + (r.Code || '')), dom.td(r.Secode), dom.td(r.Error))))));
|
||||||
|
};
|
||||||
|
let tbody = dom.tbody();
|
||||||
const render = () => {
|
const render = () => {
|
||||||
toggles = new Map();
|
toggles = new Map();
|
||||||
for (const m of (msgs || [])) {
|
for (const m of msgs) {
|
||||||
toggles.set(m.ID, dom.input(attr.type('checkbox'), attr.checked('')));
|
toggles.set(m.ID, dom.input(attr.type('checkbox'), msgs.length === 1 ? attr.checked('') : []));
|
||||||
}
|
}
|
||||||
dom._kids(tbody, (msgs || []).length === 0 ? dom.tr(dom.td(attr.colspan('14'), 'No messages.')) : [], (msgs || []).map(m => {
|
const ntbody = dom.tbody(dom._class('loadend'), msgs.length === 0 ? dom.tr(dom.td(attr.colspan('15'), 'No messages.')) : [], msgs.map(m => {
|
||||||
return dom.tr(dom.td(toggles.get(m.ID)), dom.td('' + m.ID + (m.BaseID > 0 ? '/' + m.BaseID : '')), dom.td(age(new Date(m.Queued), false, nowSecs)), dom.td(m.SenderAccount || '-'), dom.td(m.SenderLocalpart + "@" + ipdomainString(m.SenderDomain)), // todo: escaping of localpart
|
return dom.tr(dom.td(toggles.get(m.ID)), dom.td('' + m.ID + (m.BaseID > 0 ? '/' + m.BaseID : '')), dom.td(age(new Date(m.Queued), false, nowSecs)), dom.td(m.SenderAccount || '-'), dom.td(m.SenderLocalpart + "@" + ipdomainString(m.SenderDomain)), // todo: escaping of localpart
|
||||||
dom.td(m.RecipientLocalpart + "@" + ipdomainString(m.RecipientDomain)), // todo: escaping of localpart
|
dom.td(m.RecipientLocalpart + "@" + ipdomainString(m.RecipientDomain)), // todo: escaping of localpart
|
||||||
dom.td(formatSize(m.Size)), dom.td('' + m.Attempts), dom.td(m.Hold ? 'Hold' : ''), dom.td(age(new Date(m.NextAttempt), true, nowSecs)), dom.td(m.LastAttempt ? age(new Date(m.LastAttempt), false, nowSecs) : '-'), dom.td(m.LastError || '-'), dom.td(m.RequireTLS === true ? 'Yes' : (m.RequireTLS === false ? 'No' : 'Default')), dom.td(m.Transport || '(default)'));
|
dom.td(formatSize(m.Size)), dom.td('' + m.Attempts), dom.td(m.Hold ? 'Hold' : ''), dom.td(age(new Date(m.NextAttempt), true, nowSecs)), dom.td(m.LastAttempt ? age(new Date(m.LastAttempt), false, nowSecs) : '-'), dom.td(m.Results && m.Results.length > 0 ? m.Results[m.Results.length - 1].Error : []), dom.td(m.Transport || '(default)'), dom.td(m.RequireTLS === true ? 'Yes' : (m.RequireTLS === false ? 'No' : '')), dom.td(dom.clickbutton('Details', function click() {
|
||||||
|
popupDetails(m);
|
||||||
|
})));
|
||||||
}));
|
}));
|
||||||
|
tbody.replaceWith(ntbody);
|
||||||
|
tbody = ntbody;
|
||||||
};
|
};
|
||||||
render();
|
render();
|
||||||
const buttonNextAttemptSet = (text, minutes) => dom.clickbutton(text, async function click(e) {
|
const buttonNextAttemptSet = (text, minutes) => dom.clickbutton(text, async function click(e) {
|
||||||
|
@@ -2610,7 +2745,7 @@ const queueList = async () => {
|
||||||
window.alert('' + n + ' message(s) updated');
|
window.alert('' + n + ' message(s) updated');
|
||||||
window.location.reload(); // todo: reload less
|
window.location.reload(); // todo: reload less
|
||||||
});
|
});
|
||||||
dom._kids(page, crumbs(crumblink('Mox Admin', '#'), 'Queue'), dom.h2('Hold rules', attr.title('Messages submitted to the queue that match a hold rule are automatically marked as "on hold", preventing delivery until explicitly taken off hold again.')), dom.form(attr.id('holdRuleForm'), async function submit(e) {
|
dom._kids(page, crumbs(crumblink('Mox Admin', '#'), 'Queue'), dom.p(dom.a(attr.href('#queue/retired'), 'Retired messages')), dom.h2('Hold rules', attr.title('Messages submitted to the queue that match a hold rule are automatically marked as "on hold", preventing delivery until explicitly taken off hold again.')), dom.form(attr.id('holdRuleForm'), async function submit(e) {
|
||||||
e.preventDefault();
|
e.preventDefault();
|
||||||
e.stopPropagation();
|
e.stopPropagation();
|
||||||
const pr = {
|
const pr = {
|
||||||
|
@ -2654,7 +2789,8 @@ const queueList = async () => {
|
||||||
async function submit(e) {
|
async function submit(e) {
|
||||||
e.preventDefault();
|
e.preventDefault();
|
||||||
e.stopPropagation();
|
e.stopPropagation();
|
||||||
const filter = {
|
filter = {
|
||||||
|
Max: filter.Max,
|
||||||
IDs: [],
|
IDs: [],
|
||||||
Account: filterAccount.value,
|
Account: filterAccount.value,
|
||||||
From: filterFrom.value,
|
From: filterFrom.value,
|
||||||
|
@ -2664,24 +2800,54 @@ const queueList = async () => {
|
||||||
NextAttempt: filterNextAttempt.value,
|
NextAttempt: filterNextAttempt.value,
|
||||||
Transport: !filterTransport.value ? null : (filterTransport.value === '(default)' ? '' : filterTransport.value),
|
Transport: !filterTransport.value ? null : (filterTransport.value === '(default)' ? '' : filterTransport.value),
|
||||||
};
|
};
|
||||||
dom._kids(tbody);
|
sort = {
|
||||||
msgs = await check({ disabled: false }, client.QueueList(filter));
|
Field: sortElem.value.startsWith('nextattempt') ? 'NextAttempt' : 'Queued',
|
||||||
|
LastID: 0,
|
||||||
|
Last: null,
|
||||||
|
Asc: sortElem.value.endsWith('asc'),
|
||||||
|
};
|
||||||
|
tbody.classList.add('loadstart');
|
||||||
|
msgs = await check({ disabled: false }, client.QueueList(filter, sort)) || [];
|
||||||
render();
|
render();
|
||||||
}), dom.h2('Messages'), dom.table(dom._class('hover'), dom.thead(dom.tr(dom.th(), dom.th('ID'), dom.th('Submitted'), dom.th('Account'), dom.th('From'), dom.th('To'), dom.th('Size'), dom.th('Attempts'), dom.th('Hold'), dom.th('Next attempt'), dom.th('Last attempt'), dom.th('Last error'), dom.th('Require TLS'), dom.th('Transport'), dom.th()), dom.tr(dom.td(dom.input(attr.type('checkbox'), attr.checked(''), attr.form('queuefilter'), function change(e) {
|
}), dom.h2('Messages'), dom.table(dom._class('hover'), style({ width: '100%' }), dom.thead(dom.tr(dom.td(attr.colspan('2'), 'Filter'), dom.td(filterSubmitted = dom.input(attr.form('queuefilter'), style({ width: '7em' }), attr.title('Example: "<-1h" for filtering messages submitted more than 1 hour ago.'))), dom.td(filterAccount = dom.input(attr.form('queuefilter'))), dom.td(filterFrom = dom.input(attr.form('queuefilter')), attr.title('Example: "@sender.example" to filter by domain of sender.')), dom.td(filterTo = dom.input(attr.form('queuefilter')), attr.title('Example: "@recipient.example" to filter by domain of recipient.')), dom.td(), // todo: add filter by size?
|
||||||
|
dom.td(), // todo: add filter by attempts?
|
||||||
|
dom.td(filterHold = dom.select(attr.form('queuefilter'), function change() {
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}, dom.option('', attr.value('')), dom.option('Yes'), dom.option('No'))), dom.td(filterNextAttempt = dom.input(attr.form('queuefilter'), style({ width: '7em' }), attr.title('Example: ">1h" for filtering messages to be delivered in more than 1 hour, or "<now" for messages to be delivered as soon as possible.'))), dom.td(), dom.td(), dom.td(filterTransport = dom.select(Object.keys(transports || {}).length === 0 ? style({ display: 'none' }) : [], attr.form('queuefilter'), function change() {
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}, dom.option(''), dom.option('(default)'), Object.keys(transports || {}).sort().map(t => dom.option(t)))), dom.td(attr.colspan('2'), style({ textAlign: 'right' }), // Less content shifting while rendering.
|
||||||
|
'Sort ', sortElem = dom.select(attr.form('queuefilter'), function change() {
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}, dom.option('Next attempt ↑', attr.value('nextattempt-asc')), dom.option('Next attempt ↓', attr.value('nextattempt-desc')), dom.option('Submitted ↑', attr.value('submitted-asc')), dom.option('Submitted ↓', attr.value('submitted-desc'))), ' ', dom.submitbutton('Apply', attr.form('queuefilter')), ' ', dom.clickbutton('Reset', attr.form('queuefilter'), function click() {
|
||||||
|
filterForm.reset();
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}))), dom.tr(dom.td(dom.input(attr.type('checkbox'), msgs.length === 1 ? attr.checked('') : [], attr.form('queuefilter'), function change(e) {
|
||||||
const elem = e.target;
|
const elem = e.target;
|
||||||
for (const [_, toggle] of toggles) {
|
for (const [_, toggle] of toggles) {
|
||||||
toggle.checked = elem.checked;
|
toggle.checked = elem.checked;
|
||||||
}
|
}
|
||||||
})), dom.td(), dom.td(filterSubmitted = dom.input(attr.form('queuefilter'), style({ width: '7em' }), attr.title('Example: "<1h" for filtering messages submitted more than 1 minute ago.'))), dom.td(filterAccount = dom.input(attr.form('queuefilter'))), dom.td(filterFrom = dom.input(attr.form('queuefilter')), attr.title('Example: "@sender.example" to filter by domain of sender.')), dom.td(filterTo = dom.input(attr.form('queuefilter')), attr.title('Example: "@recipient.example" to filter by domain of recipient.')), dom.td(), // todo: add filter by size?
|
})), dom.th('ID'), dom.th('Submitted'), dom.th('Account'), dom.th('From'), dom.th('To'), dom.th('Size'), dom.th('Attempts'), dom.th('Hold'), dom.th('Next attempt'), dom.th('Last attempt'), dom.th('Last error'), dom.th('Transport'), dom.th('Require TLS'), dom.th('Actions'))), tbody, dom.tfoot(dom.tr(dom.td(attr.colspan('15'),
|
||||||
dom.td(), // todo: add filter by attempts?
|
// todo: consider implementing infinite scroll, autoloading more pages. means the operations on selected messages should be moved from below to above the table. and probably only show them when at least one message is selected to prevent clutter.
|
||||||
dom.td(filterHold = dom.select(attr.form('queuefilter'), dom.option('', attr.value('')), dom.option('Yes'), dom.option('No'), function change() {
|
dom.clickbutton('Load more', attr.title('Try to load more entries. Even at the end of the list, new entries may have been appended since the previous call.'), async function click(e) {
|
||||||
filterForm.requestSubmit();
|
if (msgs.length === 0) {
|
||||||
})), dom.td(filterNextAttempt = dom.input(attr.form('queuefilter'), style({ width: '7em' }), attr.title('Example: ">1h" for filtering messages to be delivered in more than 1 hour, or "<now" for messages to be delivered as soon as possible.'))), dom.td(), dom.td(), dom.td(), dom.td(filterTransport = dom.select(Object.keys(transports || []).length === 0 ? style({ display: 'none' }) : [], attr.form('queuefilter'), function change() {
|
sort.LastID = 0;
|
||||||
filterForm.requestSubmit();
|
sort.Last = null;
|
||||||
}, dom.option(''), dom.option('(default)'), Object.keys(transports || []).sort().map(t => dom.option(t)))), dom.td(dom.submitbutton('Filter', attr.form('queuefilter')), ' ', dom.clickbutton('Reset', attr.form('queuefilter'), function click() {
|
}
|
||||||
filterForm.reset();
|
else {
|
||||||
filterForm.requestSubmit();
|
const lm = msgs[msgs.length - 1];
|
||||||
})))), tbody), dom.br(), dom.br(), dom.h2('Change selected messages'), dom.div(style({ display: 'flex', gap: '2em' }), dom.div(dom.div('Hold'), dom.div(dom.clickbutton('On', async function click(e) {
|
sort.LastID = lm.ID;
|
||||||
|
if (sort.Field === "Queued") {
|
||||||
|
sort.Last = lm.Queued;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
sort.Last = lm.NextAttempt;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
tbody.classList.add('loadstart');
|
||||||
|
const l = await check(e.target, client.QueueList(filter, sort)) || [];
|
||||||
|
msgs.push(...l);
|
||||||
|
render();
|
||||||
|
}))))), dom.br(), dom.br(), dom.div(dom._class('unclutter'), dom.h2('Change selected messages'), dom.div(style({ display: 'flex', gap: '2em' }), dom.div(dom.div('Hold'), dom.div(dom.clickbutton('On', async function click(e) {
|
||||||
const n = await check(e.target, (async () => await client.QueueHoldSet(gatherIDs(), true))());
|
const n = await check(e.target, (async () => await client.QueueHoldSet(gatherIDs(), true))());
|
||||||
window.alert('' + n + ' message(s) updated');
|
window.alert('' + n + ' message(s) updated');
|
||||||
window.location.reload(); // todo: reload less
|
window.location.reload(); // todo: reload less
|
||||||
|
@@ -2705,7 +2871,7 @@ const queueList = async () => {
|
||||||
window.location.reload(); // todo: only refresh the list
|
window.location.reload(); // todo: only refresh the list
|
||||||
})), dom.div(dom.div('Delivery'), dom.clickbutton('Fail delivery', attr.title('Cause delivery to fail, sending a DSN to the sender.'), async function click(e) {
|
})), dom.div(dom.div('Delivery'), dom.clickbutton('Fail delivery', attr.title('Cause delivery to fail, sending a DSN to the sender.'), async function click(e) {
|
||||||
e.preventDefault();
|
e.preventDefault();
|
||||||
if (!window.confirm('Are you sure you want to remove this message? Notifications of delivery failure will be sent (DSNs).')) {
|
if (!window.confirm('Are you sure you want to fail delivery for the selected message(s)? Notifications of delivery failure will be sent (DSNs).')) {
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
const n = await check(e.target, (async () => await client.QueueFail(gatherIDs()))());
|
const n = await check(e.target, (async () => await client.QueueFail(gatherIDs()))());
|
||||||
|
@@ -2713,13 +2879,320 @@ const queueList = async () => {
|
||||||
window.location.reload(); // todo: only refresh the list
|
window.location.reload(); // todo: only refresh the list
|
||||||
})), dom.div(dom.div('Messages'), dom.clickbutton('Remove', attr.title('Completely remove messages from queue, not sending a DSN.'), async function click(e) {
|
})), dom.div(dom.div('Messages'), dom.clickbutton('Remove', attr.title('Completely remove messages from queue, not sending a DSN.'), async function click(e) {
|
||||||
e.preventDefault();
|
e.preventDefault();
|
||||||
if (!window.confirm('Are you sure you want to remove this message? It will be removed completely, no DSN about failure to deliver will be sent.')) {
|
if (!window.confirm('Are you sure you want to remove the selected message(s)? They will be removed completely, no DSN about failure to deliver will be sent.')) {
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
const n = await check(e.target, (async () => await client.QueueDrop(gatherIDs()))());
|
const n = await check(e.target, (async () => await client.QueueDrop(gatherIDs()))());
|
||||||
window.alert('' + n + ' message(s) updated');
|
window.alert('' + n + ' message(s) updated');
|
||||||
window.location.reload(); // todo: only refresh the list
|
window.location.reload(); // todo: only refresh the list
|
||||||
}))));
|
})))));
|
||||||
|
};
|
||||||
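The 'Load more' button above uses keyset pagination: Sort carries the ID and sort-field value of the last row already shown, and the next QueueList call continues after that position. A condensed sketch of the same pattern, reusing the filter, sort, msgs and render names from queueList above:

    const loadMore = async () => {
        const last = msgs[msgs.length - 1];
        if (last) {
            // Continue after the last rendered message.
            sort.LastID = last.ID;
            sort.Last = sort.Field === 'Queued' ? last.Queued : last.NextAttempt;
        } else {
            // Empty list: start from the beginning again.
            sort.LastID = 0;
            sort.Last = null;
        }
        msgs.push(...(await client.QueueList(filter, sort) || []));
        render();
    };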
|
const retiredList = async () => {
|
||||||
|
let filter = { Max: parseInt(localStorageGet('adminpaginationsize') || '') || 100, IDs: [], Account: '', From: '', To: '', Submitted: '', LastActivity: '', Transport: null };
|
||||||
|
let sort = { Field: "LastActivity", LastID: 0, Last: null, Asc: false };
|
||||||
|
const [retired0, transports0] = await Promise.all([
|
||||||
|
client.RetiredList(filter, sort),
|
||||||
|
client.Transports(),
|
||||||
|
]);
|
||||||
|
let retired = retired0 || [];
|
||||||
|
let transports = transports0 || {};
|
||||||
|
const nowSecs = new Date().getTime() / 1000;
|
||||||
|
let sortElem;
|
||||||
|
let filterForm;
|
||||||
|
let filterAccount;
|
||||||
|
let filterFrom;
|
||||||
|
let filterTo;
|
||||||
|
let filterSubmitted;
|
||||||
|
let filterLastActivity;
|
||||||
|
let filterTransport;
|
||||||
|
let filterSuccess;
|
||||||
|
const popupDetails = (m) => {
|
||||||
|
const nowSecs = new Date().getTime() / 1000;
|
||||||
|
popup(dom.h1('Details'), dom.table(dom.tr(dom.td('Message subject'), dom.td(m.Subject))), dom.br(), dom.h2('Results'), dom.table(dom.thead(dom.tr(dom.th('Start'), dom.th('Duration'), dom.th('Success'), dom.th('Code'), dom.th('Secode'), dom.th('Error'))), dom.tbody((m.Results || []).length === 0 ? dom.tr(dom.td(attr.colspan('6'), 'No results.')) : [], (m.Results || []).map(r => dom.tr(dom.td(age(r.Start, false, nowSecs)), dom.td(Math.round(r.Duration / 1000000) + 'ms'), dom.td(r.Success ? '✓' : ''), dom.td('' + (r.Code || '')), dom.td(r.Secode), dom.td(r.Error))))));
|
||||||
|
};
|
||||||
|
let tbody = dom.tbody();
|
||||||
|
const render = () => {
|
||||||
|
const ntbody = dom.tbody(dom._class('loadend'), retired.length === 0 ? dom.tr(dom.td(attr.colspan('14'), 'No retired messages.')) : [], retired.map(m => dom.tr(dom.td('' + m.ID + (m.BaseID > 0 ? '/' + m.BaseID : '')), dom.td(m.Success ? '✓' : ''), dom.td(age(new Date(m.LastActivity), false, nowSecs)), dom.td(age(new Date(m.Queued), false, nowSecs)), dom.td(m.SenderAccount || '-'), dom.td(m.SenderLocalpart + "@" + m.SenderDomainStr), // todo: escaping of localpart
|
||||||
|
dom.td(m.RecipientLocalpart + "@" + m.RecipientDomainStr), // todo: escaping of localpart
|
||||||
|
dom.td(formatSize(m.Size)), dom.td('' + m.Attempts), dom.td(m.LastAttempt ? age(new Date(m.LastAttempt), false, nowSecs) : '-'), dom.td(m.Results && m.Results.length > 0 ? m.Results[m.Results.length - 1].Error : []), dom.td(m.Transport || ''), dom.td(m.RequireTLS === true ? 'Yes' : (m.RequireTLS === false ? 'No' : '')), dom.td(dom.clickbutton('Details', function click() {
|
||||||
|
popupDetails(m);
|
||||||
|
})))));
|
||||||
|
tbody.replaceWith(ntbody);
|
||||||
|
tbody = ntbody;
|
||||||
|
};
|
||||||
|
render();
|
||||||
|
dom._kids(page, crumbs(crumblink('Mox Admin', '#'), crumblink('Queue', '#queue'), 'Retired messages'),
|
||||||
|
// Filtering.
|
||||||
|
filterForm = dom.form(attr.id('queuefilter'), // Referenced by input elements in table row.
|
||||||
|
async function submit(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
e.stopPropagation();
|
||||||
|
filter = {
|
||||||
|
Max: filter.Max,
|
||||||
|
IDs: [],
|
||||||
|
Account: filterAccount.value,
|
||||||
|
From: filterFrom.value,
|
||||||
|
To: filterTo.value,
|
||||||
|
Submitted: filterSubmitted.value,
|
||||||
|
LastActivity: filterLastActivity.value,
|
||||||
|
Transport: !filterTransport.value ? null : (filterTransport.value === '(default)' ? '' : filterTransport.value),
|
||||||
|
Success: filterSuccess.value === '' ? null : (filterSuccess.value === 'Yes' ? true : false),
|
||||||
|
};
|
||||||
|
sort = {
|
||||||
|
Field: sortElem.value.startsWith('lastactivity') ? 'LastActivity' : 'Queued',
|
||||||
|
LastID: 0,
|
||||||
|
Last: null,
|
||||||
|
Asc: sortElem.value.endsWith('asc'),
|
||||||
|
};
|
||||||
|
tbody.classList.add('loadstart');
|
||||||
|
retired = await check({ disabled: false }, client.RetiredList(filter, sort)) || [];
|
||||||
|
render();
|
||||||
|
}), dom.h2('Retired messages'), dom.p('Meta information about queued messages may be kept after successful and/or failed delivery, configurable per account.'), dom.table(dom._class('hover'), style({ width: '100%' }), dom.thead(dom.tr(dom.td('Filter'), dom.td(filterSuccess = dom.select(attr.form('queuefilter'), function change() {
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}, dom.option(''), dom.option('Yes'), dom.option('No'))), dom.td(filterLastActivity = dom.input(attr.form('queuefilter'), style({ width: '7em' }), attr.title('Example: ">-1h" for filtering messages with last activity less than 1 hour ago.'))), dom.td(filterSubmitted = dom.input(attr.form('queuefilter'), style({ width: '7em' }), attr.title('Example: "<-1h" for filtering messages submitted more than 1 hour ago.'))), dom.td(filterAccount = dom.input(attr.form('queuefilter'))), dom.td(filterFrom = dom.input(attr.form('queuefilter')), attr.title('Example: "@sender.example" to filter by domain of sender.')), dom.td(filterTo = dom.input(attr.form('queuefilter')), attr.title('Example: "@recipient.example" to filter by domain of recipient.')), dom.td(), // todo: add filter by size?
|
||||||
|
dom.td(), // todo: add filter by attempts?
|
||||||
|
dom.td(), dom.td(), dom.td(filterTransport = dom.select(Object.keys(transports).length === 0 ? style({ display: 'none' }) : [], attr.form('queuefilter'), function change() {
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}, dom.option(''), dom.option('(default)'), Object.keys(transports).sort().map(t => dom.option(t)))), dom.td(attr.colspan('2'), style({ textAlign: 'right' }), // Less content shifting while rendering.
|
||||||
|
'Sort ', sortElem = dom.select(attr.form('queuefilter'), function change() {
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}, dom.option('Last activity ↓', attr.value('lastactivity-desc')), dom.option('Last activity ↑', attr.value('lastactivity-asc')), dom.option('Submitted ↓', attr.value('submitted-desc')), dom.option('Submitted ↑', attr.value('submitted-asc'))), ' ', dom.submitbutton('Apply', attr.form('queuefilter')), ' ', dom.clickbutton('Reset', attr.form('queuefilter'), function click() {
|
||||||
|
filterForm.reset();
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}))), dom.tr(dom.th('ID'), dom.th('Success'), dom.th('Last activity'), dom.th('Submitted'), dom.th('Account'), dom.th('From'), dom.th('To'), dom.th('Size'), dom.th('Attempts'), dom.th('Last attempt'), dom.th('Last error'), dom.th('Transport'), dom.th('Require TLS'), dom.th('Actions'))), tbody, dom.tfoot(dom.tr(dom.td(attr.colspan('14'), dom.clickbutton('Load more', attr.title('Try to load more entries. Even at the end of the list, new entries may have been appended since the previous call.'), async function click(e) {
|
||||||
|
if (retired.length === 0) {
|
||||||
|
sort.LastID = 0;
|
||||||
|
sort.Last = null;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
const lm = retired[retired.length - 1];
|
||||||
|
sort.LastID = lm.ID;
|
||||||
|
if (sort.Field === "Queued") {
|
||||||
|
sort.Last = lm.Queued;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
sort.Last = lm.LastActivity;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
tbody.classList.add('loadstart');
|
||||||
|
const l = await check(e.target, client.RetiredList(filter, sort)) || [];
|
||||||
|
retired.push(...l);
|
||||||
|
render();
|
||||||
|
}))))));
|
||||||
|
};
|
||||||
|
const formatExtra = (extra) => {
|
||||||
|
if (!extra) {
|
||||||
|
return '';
|
||||||
|
}
|
||||||
|
return Object.entries(extra).sort((a, b) => a[0] < b[0] ? -1 : 1).map(t => t[0] + ': ' + t[1]).join('; ');
|
||||||
|
};
|
||||||
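For illustration (hypothetical values), formatExtra sorts the keys and joins the pairs with '; ':

    console.log(formatExtra({ campaign: 'spring', batch: '7' })); // "batch: 7; campaign: spring"
    console.log(formatExtra(undefined)); // empty string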
|
const hooksList = async () => {
let filter = { Max: parseInt(localStorageGet('adminpaginationsize') || '') || 100, IDs: [], Account: '', Submitted: '', NextAttempt: '', Event: '' };
let sort = { Field: "NextAttempt", LastID: 0, Last: null, Asc: true };
let hooks = await client.HookList(filter, sort) || [];
const nowSecs = new Date().getTime() / 1000;
let sortElem;
let filterForm;
let filterSubmitted;
let filterAccount;
let filterEvent;
let filterNextAttempt;
// Hook ID to checkbox.
let toggles = new Map();
// We operate on what the user has selected, not what the filters would currently
// evaluate to. This function can throw an error, which is why we have awkward
// syntax when calling this as parameter in api client calls below.
const gatherIDs = () => {
const f = {
Max: 0,
IDs: Array.from(toggles.entries()).filter(t => t[1].checked).map(t => t[0]),
Account: '',
Event: '',
Submitted: '',
NextAttempt: '',
};
// Don't want to accidentally operate on all messages.
if ((f.IDs || []).length === 0) {
throw new Error('No hooks selected.');
}
return f;
};
const popupDetails = (h) => {
|
||||||
|
const nowSecs = new Date().getTime() / 1000;
|
||||||
|
popup(dom.h1('Details'), dom.div(dom._class('twocols'), dom.div(dom.table(dom.tr(dom.td('Message subject'), dom.td(h.Subject))), dom.br(), dom.h2('Results'), dom.table(dom.thead(dom.tr(dom.th('Start'), dom.th('Duration'), dom.th('Success'), dom.th('Code'), dom.th('Error'), dom.th('URL'), dom.th('Response'))), dom.tbody((h.Results || []).length === 0 ? dom.tr(dom.td(attr.colspan('7'), 'No results.')) : [], (h.Results || []).map(r => dom.tr(dom.td(age(r.Start, false, nowSecs)), dom.td(Math.round(r.Duration / 1000000) + 'ms'), dom.td(r.Success ? '✓' : ''), dom.td('' + (r.Code || '')), dom.td(r.Error), dom.td(r.URL), dom.td(r.Response))))), dom.br()), dom.div(dom.h2('Webhook JSON body'), dom.pre(dom._class('literal'), JSON.stringify(JSON.parse(h.Payload), undefined, '\t')))));
|
||||||
|
};
|
||||||
|
let tbody = dom.tbody();
|
||||||
|
const render = () => {
|
||||||
|
toggles = new Map();
|
||||||
|
for (const h of (hooks || [])) {
|
||||||
|
toggles.set(h.ID, dom.input(attr.type('checkbox'), (hooks || []).length === 1 ? attr.checked('') : []));
|
||||||
|
}
|
||||||
|
const ntbody = dom.tbody(dom._class('loadend'), hooks.length === 0 ? dom.tr(dom.td(attr.colspan('15'), 'No webhooks.')) : [], hooks.map(h => dom.tr(dom.td(toggles.get(h.ID)), dom.td('' + h.ID), dom.td(age(new Date(h.Submitted), false, nowSecs)), dom.td('' + (h.QueueMsgID || '')), // todo future: make it easy to open the corresponding (retired) message from queue (if still around).
|
||||||
|
dom.td('' + h.FromID), dom.td('' + h.MessageID), dom.td(h.Account || '-'), dom.td(h.IsIncoming ? "incoming" : h.OutgoingEvent), dom.td(formatExtra(h.Extra)), dom.td('' + h.Attempts), dom.td(age(h.NextAttempt, true, nowSecs)), dom.td(h.Results && h.Results.length > 0 ? age(h.Results[h.Results.length - 1].Start, false, nowSecs) : []), dom.td(h.Results && h.Results.length > 0 ? h.Results[h.Results.length - 1].Error : []), dom.td(h.URL), dom.td(dom.clickbutton('Details', function click() {
|
||||||
|
popupDetails(h);
|
||||||
|
})))));
|
||||||
|
tbody.replaceWith(ntbody);
|
||||||
|
tbody = ntbody;
|
||||||
|
};
|
||||||
|
render();
|
||||||
|
const buttonNextAttemptSet = (text, minutes) => dom.clickbutton(text, async function click(e) {
|
||||||
|
// note: awkward client call because gatherIDs() can throw an exception.
|
||||||
|
const n = await check(e.target, (async () => client.HookNextAttemptSet(gatherIDs(), minutes))());
|
||||||
|
window.alert('' + n + ' hook(s) updated');
|
||||||
|
window.location.reload(); // todo: reload less
|
||||||
|
});
|
||||||
|
const buttonNextAttemptAdd = (text, minutes) => dom.clickbutton(text, async function click(e) {
|
||||||
|
const n = await check(e.target, (async () => client.HookNextAttemptAdd(gatherIDs(), minutes))());
|
||||||
|
window.alert('' + n + ' hook(s) updated');
|
||||||
|
window.location.reload(); // todo: reload less
|
||||||
|
});
|
||||||
|
dom._kids(page, crumbs(crumblink('Mox Admin', '#'), 'Webhook queue'), dom.p(dom.a(attr.href('#webhookqueue/retired'), 'Retired webhooks')), dom.h2('Webhooks'), dom.table(dom._class('hover'), style({ width: '100%' }), dom.thead(dom.tr(dom.td(attr.colspan('2'), 'Filter'), dom.td(filterSubmitted = dom.input(attr.form('hooksfilter'), style({ width: '7em' }), attr.title('Example: "<-1h" for filtering webhooks submitted more than 1 hour ago.'))), dom.td(), dom.td(), dom.td(), dom.td(filterAccount = dom.input(attr.form('hooksfilter'), style({ width: '8em' }))), dom.td(filterEvent = dom.select(attr.form('hooksfilter'), function change() {
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}, dom.option(''),
|
||||||
|
// note: outgoing hook events are in ../webhook/webhook.go, ../mox-/config.go ../webadmin/admin.ts and ../webapi/gendoc.sh. keep in sync.
|
||||||
|
['incoming', 'delivered', 'suppressed', 'delayed', 'failed', 'relayed', 'expanded', 'canceled', 'unrecognized'].map(s => dom.option(s)))), dom.td(), dom.td(), dom.td(filterNextAttempt = dom.input(attr.form('hooksfilter'), style({ width: '7em' }), attr.title('Example: ">1h" for filtering webhooks to be delivered in more than 1 hour, or "<now" for webhooks to be delivered as soon as possible.'))), dom.td(), dom.td(), dom.td(attr.colspan('2'), style({ textAlign: 'right' }), // Less content shifting while rendering.
|
||||||
|
'Sort ', sortElem = dom.select(attr.form('hooksfilter'), function change() {
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}, dom.option('Next attempt ↑', attr.value('nextattempt-asc')), dom.option('Next attempt ↓', attr.value('nextattempt-desc')), dom.option('Submitted ↑', attr.value('submitted-asc')), dom.option('Submitted ↓', attr.value('submitted-desc'))), ' ', dom.submitbutton('Apply', attr.form('hooksfilter')), ' ', dom.clickbutton('Reset', attr.form('hooksfilter'), function click() {
|
||||||
|
filterForm.reset();
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}))), dom.tr(dom.td(dom.input(attr.type('checkbox'), (hooks || []).length === 1 ? attr.checked('') : [], attr.form('hooksfilter'), function change(e) {
|
||||||
|
const elem = e.target;
|
||||||
|
for (const [_, toggle] of toggles) {
|
||||||
|
toggle.checked = elem.checked;
|
||||||
|
}
|
||||||
|
})), dom.th('ID'), dom.th('Submitted'), dom.th('Queue Msg ID', attr.title('ID of queued message this event is about.')), dom.th('FromID'), dom.th('MessageID'), dom.th('Account'), dom.th('Event'), dom.th('Extra'), dom.th('Attempts'), dom.th('Next'), dom.th('Last'), dom.th('Error'), dom.th('URL'), dom.th('Actions'))), tbody, dom.tfoot(dom.tr(dom.td(attr.colspan('15'), dom.clickbutton('Load more', attr.title('Try to load more entries. You can still try to load more entries when at the end of the list, new entries may have been appended since the previous call.'), async function click(e) {
|
||||||
|
if (hooks.length === 0) {
|
||||||
|
sort.LastID = 0;
|
||||||
|
sort.Last = null;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
const last = hooks[hooks.length - 1];
|
||||||
|
sort.LastID = last.ID;
|
||||||
|
if (sort.Field === "Submitted") {
|
||||||
|
sort.Last = last.Submitted;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
sort.Last = last.NextAttempt;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
tbody.classList.add('loadstart');
|
||||||
|
const l = await check(e.target, client.HookList(filter, sort)) || [];
|
||||||
|
hooks.push(...l);
|
||||||
|
render();
|
||||||
|
}))))),
|
||||||
|
// Filtering.
|
||||||
|
filterForm = dom.form(attr.id('hooksfilter'), // Referenced by input elements in table row.
|
||||||
|
async function submit(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
e.stopPropagation();
|
||||||
|
filter = {
|
||||||
|
Max: filter.Max,
|
||||||
|
IDs: [],
|
||||||
|
Account: filterAccount.value,
|
||||||
|
Event: filterEvent.value,
|
||||||
|
Submitted: filterSubmitted.value,
|
||||||
|
NextAttempt: filterNextAttempt.value,
|
||||||
|
};
|
||||||
|
sort = {
|
||||||
|
Field: sortElem.value.startsWith('nextattempt') ? 'NextAttempt' : 'Submitted',
|
||||||
|
LastID: 0,
|
||||||
|
Last: null,
|
||||||
|
Asc: sortElem.value.endsWith('asc'),
|
||||||
|
};
|
||||||
|
tbody.classList.add('loadstart');
|
||||||
|
hooks = await check({ disabled: false }, client.HookList(filter, sort)) || [];
|
||||||
|
render();
|
||||||
|
}), dom.br(), dom.br(), dom.div(dom._class('unclutter'), dom.h2('Change selected webhooks'), dom.div(style({ display: 'flex', gap: '2em' }), dom.div(dom.div('Schedule next delivery attempt'), buttonNextAttemptSet('Now', 0), ' ', dom.clickbutton('More...', function click(e) {
|
||||||
|
e.target.replaceWith(dom.div(dom.br(), dom.div('Scheduled time plus'), dom.div(buttonNextAttemptAdd('1m', 1), ' ', buttonNextAttemptAdd('5m', 5), ' ', buttonNextAttemptAdd('30m', 30), ' ', buttonNextAttemptAdd('1h', 60), ' ', buttonNextAttemptAdd('2h', 2 * 60), ' ', buttonNextAttemptAdd('4h', 4 * 60), ' ', buttonNextAttemptAdd('8h', 8 * 60), ' ', buttonNextAttemptAdd('16h', 16 * 60), ' '), dom.br(), dom.div('Now plus'), dom.div(buttonNextAttemptSet('1m', 1), ' ', buttonNextAttemptSet('5m', 5), ' ', buttonNextAttemptSet('30m', 30), ' ', buttonNextAttemptSet('1h', 60), ' ', buttonNextAttemptSet('2h', 2 * 60), ' ', buttonNextAttemptSet('4h', 4 * 60), ' ', buttonNextAttemptSet('8h', 8 * 60), ' ', buttonNextAttemptSet('16h', 16 * 60), ' ')));
|
||||||
|
})), dom.div(dom.div('Delivery'), dom.clickbutton('Cancel', attr.title('Retires webhooks, preventing further delivery attempts.'), async function click(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
if (!window.confirm('Are you sure you want to cancel these webhooks?')) {
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
const n = await check(e.target, (async () => await client.HookCancel(gatherIDs()))());
|
||||||
|
window.alert('' + n + ' webhook(s) updated');
|
||||||
|
window.location.reload(); // todo: only refresh the list
|
||||||
|
})))));
};
const hooksRetiredList = async () => {
let filter = { Max: parseInt(localStorageGet('adminpaginationsize') || '') || 100, IDs: [], Account: '', Submitted: '', LastActivity: '', Event: '' };
let sort = { Field: "LastActivity", LastID: 0, Last: null, Asc: false };
let hooks = await client.HookRetiredList(filter, sort) || [];
const nowSecs = new Date().getTime() / 1000;
let sortElem;
let filterForm;
let filterSubmitted;
let filterAccount;
let filterEvent;
let filterLastActivity;
const popupDetails = (h) => {
|
||||||
|
const nowSecs = new Date().getTime() / 1000;
|
||||||
|
popup(dom.h1('Details'), dom.div(dom._class('twocols'), dom.div(dom.table(dom.tr(dom.td('Message subject'), dom.td(h.Subject)), h.SupersededByID != 0 ? dom.tr(dom.td('Superseded by webhook ID'), dom.td('' + h.SupersededByID)) : []), dom.br(), dom.h2('Results'), dom.table(dom.thead(dom.tr(dom.th('Start'), dom.th('Duration'), dom.th('Success'), dom.th('Code'), dom.th('Error'), dom.th('URL'), dom.th('Response'))), dom.tbody((h.Results || []).length === 0 ? dom.tr(dom.td(attr.colspan('7'), 'No results.')) : [], (h.Results || []).map(r => dom.tr(dom.td(age(r.Start, false, nowSecs)), dom.td(Math.round(r.Duration / 1000000) + 'ms'), dom.td(r.Success ? '✓' : ''), dom.td('' + (r.Code || '')), dom.td(r.Error), dom.td(r.URL), dom.td(r.Response))))), dom.br()), dom.div(dom.h2('Webhook JSON body'), dom.pre(dom._class('literal'), JSON.stringify(JSON.parse(h.Payload), undefined, '\t')))));
|
||||||
|
};
|
||||||
|
let tbody = dom.tbody();
|
||||||
|
// todo future: add selection + button to reschedule old retired webhooks.
|
||||||
|
const render = () => {
|
||||||
|
const ntbody = dom.tbody(dom._class('loadend'), hooks.length === 0 ? dom.tr(dom.td(attr.colspan('14'), 'No retired webhooks.')) : [], hooks.map(h => dom.tr(dom.td('' + h.ID), dom.td(h.Success ? '✓' : ''), dom.td(age(h.LastActivity, false, nowSecs)), dom.td(age(new Date(h.Submitted), false, nowSecs)), dom.td('' + (h.QueueMsgID || '')), dom.td('' + h.FromID), dom.td('' + h.MessageID), dom.td(h.Account || '-'), dom.td(h.IsIncoming ? "incoming" : h.OutgoingEvent), dom.td(formatExtra(h.Extra)), dom.td('' + h.Attempts), dom.td(h.Results && h.Results.length > 0 ? h.Results[h.Results.length - 1].Error : []), dom.td(h.URL), dom.td(dom.clickbutton('Details', function click() {
|
||||||
|
popupDetails(h);
|
||||||
|
})))));
|
||||||
|
tbody.replaceWith(ntbody);
|
||||||
|
tbody = ntbody;
|
||||||
|
};
|
||||||
|
render();
|
||||||
|
dom._kids(page, crumbs(crumblink('Mox Admin', '#'), crumblink('Webhook queue', '#webhookqueue'), 'Retired webhooks'), dom.h2('Retired webhooks'), dom.table(dom._class('hover'), style({ width: '100%' }), dom.thead(dom.tr(dom.td('Filter'), dom.td(), dom.td(filterLastActivity = dom.input(attr.form('hooksfilter'), style({ width: '7em' }), attr.title('Example: ">-1h" for filtering last activity for webhooks more than 1 hour ago.'))), dom.td(filterSubmitted = dom.input(attr.form('hooksfilter'), style({ width: '7em' }), attr.title('Example: "<-1h" for filtering webhooks submitted more than 1 hour ago.'))), dom.td(), dom.td(), dom.td(), dom.td(filterAccount = dom.input(attr.form('hooksfilter'), style({ width: '8em' }))), dom.td(filterEvent = dom.select(attr.form('hooksfilter'), function change() {
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}, dom.option(''),
|
||||||
|
// note: outgoing hook events are in ../webhook/webhook.go, ../mox-/config.go ../webadmin/admin.ts and ../webapi/gendoc.sh. keep in sync.
|
||||||
|
['incoming', 'delivered', 'suppressed', 'delayed', 'failed', 'relayed', 'expanded', 'canceled', 'unrecognized'].map(s => dom.option(s)))), dom.td(), dom.td(), dom.td(), dom.td(attr.colspan('2'), style({ textAlign: 'right' }), // Less content shifting while rendering.
|
||||||
|
'Sort ', sortElem = dom.select(attr.form('hooksfilter'), function change() {
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}, dom.option('Last activity ↓', attr.value('nextattempt-desc')), dom.option('Last activity ↑', attr.value('nextattempt-asc')), dom.option('Submitted ↓', attr.value('submitted-desc')), dom.option('Submitted ↑', attr.value('submitted-asc'))), ' ', dom.submitbutton('Apply', attr.form('hooksfilter')), ' ', dom.clickbutton('Reset', attr.form('hooksfilter'), function click() {
|
||||||
|
filterForm.reset();
|
||||||
|
filterForm.requestSubmit();
|
||||||
|
}))), dom.tr(dom.th('ID'), dom.th('Success'), dom.th('Last'), dom.th('Submitted'), dom.th('Queue Msg ID', attr.title('ID of queued message this event is about.')), dom.th('FromID'), dom.th('MessageID'), dom.th('Account'), dom.th('Event'), dom.th('Extra'), dom.th('Attempts'), dom.th('Error'), dom.th('URL'), dom.th('Actions'))), tbody, dom.tfoot(dom.tr(dom.td(attr.colspan('14'), dom.clickbutton('Load more', attr.title('Try to load more entries. You can still try to load more entries when at the end of the list, new entries may have been appended since the previous call.'), async function click(e) {
|
||||||
|
if (hooks.length === 0) {
|
||||||
|
sort.LastID = 0;
|
||||||
|
sort.Last = null;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
const last = hooks[hooks.length - 1];
|
||||||
|
sort.LastID = last.ID;
|
||||||
|
if (sort.Field === "Submitted") {
|
||||||
|
sort.Last = last.Submitted;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
sort.Last = last.LastActivity;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
tbody.classList.add('loadstart');
|
||||||
|
const l = await check(e.target, client.HookRetiredList(filter, sort)) || [];
|
||||||
|
hooks.push(...l);
|
||||||
|
render();
|
||||||
|
}))))),
|
||||||
|
// Filtering.
|
||||||
|
filterForm = dom.form(attr.id('hooksfilter'), // Referenced by input elements in table row.
|
||||||
|
async function submit(e) {
|
||||||
|
e.preventDefault();
|
||||||
|
e.stopPropagation();
|
||||||
|
filter = {
|
||||||
|
Max: filter.Max,
|
||||||
|
IDs: [],
|
||||||
|
Account: filterAccount.value,
|
||||||
|
Event: filterEvent.value,
|
||||||
|
Submitted: filterSubmitted.value,
|
||||||
|
LastActivity: filterLastActivity.value,
|
||||||
|
};
|
||||||
|
sort = {
|
||||||
|
Field: sortElem.value.startsWith('lastactivity') ? 'LastActivity' : 'Submitted',
|
||||||
|
LastID: 0,
|
||||||
|
Last: null,
|
||||||
|
Asc: sortElem.value.endsWith('asc'),
|
||||||
|
};
|
||||||
|
tbody.classList.add('loadstart');
|
||||||
|
hooks = await check({ disabled: false }, client.HookRetiredList(filter, sort)) || [];
|
||||||
|
render();
|
||||||
|
}));
};
const webserver = async () => {
let conf = await client.WebserverConfig();
@@ -3072,6 +3545,15 @@ const init = async () => {
else if (h === 'queue') {
await queueList();
}
else if (h === 'queue/retired') {
await retiredList();
}
else if (h === 'webhookqueue') {
await hooksList();
}
else if (h === 'webhookqueue/retired') {
await hooksRetiredList();
}
else if (h === 'tlsrpt') {
await tlsrptIndex();
}
webadmin/admin.ts (1229 changed lines): file diff suppressed because it is too large.
@@ -12,6 +12,7 @@ import (
"net/http/httptest"
"os"
"path/filepath"
"reflect"
"runtime/debug"
"strings"
"testing"
@@ -25,6 +26,7 @@ import (
"github.com/mjl-/mox/dns"
"github.com/mjl-/mox/mlog"
"github.com/mjl-/mox/mox-"
"github.com/mjl-/mox/queue"
"github.com/mjl-/mox/store"
"github.com/mjl-/mox/webauth"
)
@@ -64,6 +66,13 @@ func tcheck(t *testing.T, err error, msg string) {
}
}

func tcompare(t *testing.T, got, expect any) {
t.Helper()
if !reflect.DeepEqual(got, expect) {
t.Fatalf("got:\n%#v\nexpected:\n%#v", got, expect)
}
}

func readBody(r io.Reader) string {
buf, err := io.ReadAll(r)
if err != nil {
@@ -200,6 +209,30 @@ func TestAdminAuth(t *testing.T) {

api.Logout(ctx)
tneedErrorCode(t, "server:error", func() { api.Logout(ctx) })

err = queue.Init()
tcheck(t, err, "queue init")

mrl := api.RetiredList(ctxbg, queue.RetiredFilter{}, queue.RetiredSort{})
tcompare(t, len(mrl), 0)

n := api.HookQueueSize(ctxbg)
tcompare(t, n, 0)

hl := api.HookList(ctxbg, queue.HookFilter{}, queue.HookSort{})
tcompare(t, len(hl), 0)

n = api.HookNextAttemptSet(ctxbg, queue.HookFilter{}, 0)
tcompare(t, n, 0)

n = api.HookNextAttemptAdd(ctxbg, queue.HookFilter{}, 0)
tcompare(t, n, 0)

hrl := api.HookRetiredList(ctxbg, queue.HookRetiredFilter{}, queue.HookRetiredSort{})
tcompare(t, len(hrl), 0)

n = api.HookCancel(ctxbg, queue.HookFilter{})
tcompare(t, n, 0)
}

func TestCheckDomain(t *testing.T) {
webadmin/api.json (1242 changed lines): file diff suppressed because it is too large.
webadmin/api.ts (310 changed lines):
@@ -267,6 +267,11 @@ export interface AutodiscoverSRV {
}

export interface Account {
OutgoingWebhook?: OutgoingWebhook | null
IncomingWebhook?: IncomingWebhook | null
FromIDLoginAddresses?: string[] | null
KeepRetiredMessagePeriod: number
KeepRetiredWebhookPeriod: number
Domain: string
Description: string
FullName: string
@@ -284,6 +289,17 @@ export interface Account {
DNSDomain: Domain // Parsed form of Domain.
}

export interface OutgoingWebhook {
URL: string
Authorization: string
Events?: string[] | null
}

export interface IncomingWebhook {
URL: string
Authorization: string
}
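For illustration only, not part of the diff: a hypothetical Account fragment showing how the new webhook and retention fields fit together. The URLs and secret are made up, the event names are taken from the list used in the admin UI above, and the retention periods are assumed to be Go durations serialized as nanoseconds.

const exampleAccount: Partial<Account> = {
	OutgoingWebhook: {
		URL: 'https://app.example/mox/hooks/outgoing', // hypothetical endpoint
		Authorization: 'Bearer examplesecret', // optional Authorization header value
		Events: ['delivered', 'suppressed', 'failed'], // subset of the outgoing events listed above
	},
	IncomingWebhook: {
		URL: 'https://app.example/mox/hooks/incoming', // hypothetical endpoint
		Authorization: '',
	},
	FromIDLoginAddresses: ['sender@example.org'], // login addresses that get a unique SMTP MAIL FROM (from-id)
	KeepRetiredMessagePeriod: 72 * 60 * 60 * 1e9, // assumed: nanoseconds, here 72 hours
	KeepRetiredWebhookPeriod: 72 * 60 * 60 * 1e9,
};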
export interface Destination {
Mailbox: string
Rulesets?: Ruleset[] | null
@@ -550,6 +566,7 @@ export interface HoldRule {
// Only non-empty/non-zero values are applied to the filter. Leaving all fields
// empty/zero matches all messages.
export interface Filter {
Max: number
IDs?: number[] | null
Account: string
From: string
@@ -560,6 +577,13 @@ export interface Filter {
Transport?: string | null
}

export interface Sort {
Field: string // "Queued" or "NextAttempt"/"".
LastID: number // If > 0, we return objects beyond this, less/greater depending on Asc.
Last: any // Value of Field for last object. Must be set iff LastID is set.
Asc: boolean // Ascending, or descending.
}
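For illustration only (not part of the diff): a minimal sketch of how a caller can advance this keyset pagination, mirroring the "Load more" handlers in admin.js above; it only derives the next Sort cursor from the page of Msg objects just received.

function nextQueueSort(sort: Sort, page: Msg[]): Sort {
	if (page.length === 0) {
		// Restart from the beginning: Last must be unset when LastID is 0.
		return { ...sort, LastID: 0, Last: null };
	}
	const last = page[page.length - 1];
	return {
		...sort,
		LastID: last.ID,
		// Field is "Queued" or "NextAttempt" (the default when empty).
		Last: sort.Field === 'Queued' ? last.Queued : last.NextAttempt,
	};
}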

// Msg is a message in the queue.
//
// Use MakeMsg to make a message with fields that Add needs. Add will further set
@@ -573,26 +597,29 @@ export interface Msg {
SenderLocalpart: Localpart // Should be a local user and domain.
SenderDomain: IPDomain
SenderDomainStr: string // For filtering, unicode.
FromID: string // For transactional messages, used to match later DSNs.
RecipientLocalpart: Localpart // Typically a remote user and domain.
RecipientDomain: IPDomain
RecipientDomainStr: string // For filtering, unicode.
RecipientDomainStr: string // For filtering, unicode domain. Can also contain ip enclosed in [].
Attempts: number // Next attempt is based on last attempt and exponential back off based on attempts.
MaxAttempts: number // Max number of attempts before giving up. If 0, then the default of 8 attempts is used instead.
DialedIPs?: { [key: string]: IP[] | null } // For each host, the IPs that were dialed. Used for IP selection for later attempts.
NextAttempt: Date // For scheduling.
LastAttempt?: Date | null
LastError: string
Results?: MsgResult[] | null
Has8bit: boolean // Whether message contains bytes with high bit set, determines whether 8BITMIME SMTP extension is needed.
SMTPUTF8: boolean // Whether message requires use of SMTPUTF8.
IsDMARCReport: boolean // Delivery failures for DMARC reports are handled differently.
IsTLSReport: boolean // Delivery failures for TLS reports are handled differently.
Size: number // Full size of message, combined MsgPrefix with contents of message file.
MessageID: string // Used when composing a DSN, in its References header.
MessageID: string // Message-ID header, including <>. Used when composing a DSN, in its References header.
MsgPrefix?: string | null
MsgPrefix?: string | null // Data to send before the contents from the file, typically with headers like DKIM-Signature.
Subject: string // For context about delivery.
DSNUTF8?: string | null // If set, this message is a DSN and this is a version using utf-8, for the case the remote MTA supports smtputf8. In this case, Size and MsgPrefix are not relevant.
Transport: string // If non-empty, the transport to use for this message. Can be set through cli or admin interface. If empty (the default for a submitted message), regular routing rules apply.
RequireTLS?: boolean | null // RequireTLS influences TLS verification during delivery. If nil, the recipient domain policy is followed (MTA-STS and/or DANE), falling back to optional opportunistic non-verified STARTTLS. If RequireTLS is true (through SMTP REQUIRETLS extension or webmail submit), MTA-STS or DANE is required, as well as REQUIRETLS support by the next hop server. If RequireTLS is false (through messag header "TLS-Required: No"), the recipient domain's policy is ignored if it does not lead to a successful TLS connection, i.e. falling back to SMTP delivery with unverified STARTTLS or plain text.
FutureReleaseRequest: string // For DSNs, where the original FUTURERELEASE value must be included as per-message field. This field should be of the form "for;" plus interval, or "until;" plus utc date-time.
Extra?: { [key: string]: string } // Extra information, for transactional email.
}

// IPDomain is an ip address, a domain, or empty.
@@ -601,6 +628,173 @@ export interface IPDomain {
Domain: Domain
}

// MsgResult is the result (or work in progress) of a delivery attempt.
export interface MsgResult {
Start: Date
Duration: number
Success: boolean
Code: number
Secode: string
Error: string
}

// RetiredFilter filters messages to list or operate on. Used by admin web interface
// and cli.
//
// Only non-empty/non-zero values are applied to the filter. Leaving all fields
// empty/zero matches all messages.
export interface RetiredFilter {
Max: number
IDs?: number[] | null
Account: string
From: string
To: string
Submitted: string // Whether submitted before/after a time relative to now. ">$duration" or "<$duration", also with "now" for duration.
LastActivity: string // ">$duration" or "<$duration", also with "now" for duration.
Transport?: string | null
Success?: boolean | null
}
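For illustration only: a hypothetical RetiredFilter using the relative-time syntax documented above (and shown as placeholder examples in the admin UI).

const recentFailures: RetiredFilter = {
	Max: 50,
	IDs: [],
	Account: '',
	From: '',
	To: '@recipient.example', // filter by recipient domain, as in the UI examples
	Submitted: '<-24h', // submitted more than 24 hours ago
	LastActivity: '>-1h', // last activity less than 1 hour ago
	Transport: null,
	Success: false, // only failed deliveries
};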

export interface RetiredSort {
Field: string // "Queued" or "LastActivity"/"".
LastID: number // If > 0, we return objects beyond this, less/greater depending on Asc.
Last: any // Value of Field for last object. Must be set iff LastID is set.
Asc: boolean // Ascending, or descending.
}

// MsgRetired is a message for which delivery completed, either successful,
// failed/canceled. Retired messages are only stored if so configured, and will be
// cleaned up after the configured period.
export interface MsgRetired {
ID: number // Same ID as it was as Msg.ID.
BaseID: number
Queued: Date
SenderAccount: string // Failures are delivered back to this local account. Also used for routing.
SenderLocalpart: Localpart // Should be a local user and domain.
SenderDomainStr: string // For filtering, unicode.
FromID: string // Used to match DSNs.
RecipientLocalpart: Localpart // Typically a remote user and domain.
RecipientDomain: IPDomain
RecipientDomainStr: string // For filtering, unicode.
Attempts: number // Next attempt is based on last attempt and exponential back off based on attempts.
MaxAttempts: number // Max number of attempts before giving up. If 0, then the default of 8 attempts is used instead.
DialedIPs?: { [key: string]: IP[] | null } // For each host, the IPs that were dialed. Used for IP selection for later attempts.
LastAttempt?: Date | null
Results?: MsgResult[] | null
Has8bit: boolean // Whether message contains bytes with high bit set, determines whether 8BITMIME SMTP extension is needed.
SMTPUTF8: boolean // Whether message requires use of SMTPUTF8.
IsDMARCReport: boolean // Delivery failures for DMARC reports are handled differently.
IsTLSReport: boolean // Delivery failures for TLS reports are handled differently.
Size: number // Full size of message, combined MsgPrefix with contents of message file.
MessageID: string // Used when composing a DSN, in its References header.
Subject: string // For context about delivery.
Transport: string
RequireTLS?: boolean | null
FutureReleaseRequest: string
Extra?: { [key: string]: string } // Extra information, for transactional email.
LastActivity: Date
RecipientAddress: string
Success: boolean // Whether delivery to next hop succeeded.
KeepUntil: Date
}

// HookFilter filters messages to list or operate on. Used by admin web interface
// and cli.
//
// Only non-empty/non-zero values are applied to the filter. Leaving all fields
// empty/zero matches all hooks.
export interface HookFilter {
Max: number
IDs?: number[] | null
Account: string
Submitted: string // Whether submitted before/after a time relative to now. ">$duration" or "<$duration", also with "now" for duration.
NextAttempt: string // ">$duration" or "<$duration", also with "now" for duration.
Event: string // Including "incoming".
}

export interface HookSort {
Field: string // "Queued" or "NextAttempt"/"".
LastID: number // If > 0, we return objects beyond this, less/greater depending on Asc.
Last: any // Value of Field for last object. Must be set iff LastID is set.
Asc: boolean // Ascending, or descending.
}

// Hook is a webhook call about a delivery. We'll try delivering with backoff until we succeed or fail.
export interface Hook {
ID: number
QueueMsgID: number // Original queue Msg/MsgRetired ID. Zero for hooks for incoming messages.
FromID: string // As generated by us and returned in webapi call. Can be empty, for incoming messages to our base address.
MessageID: string // Of outgoing or incoming messages. Includes <>.
Subject: string // Subject of original outgoing message, or of incoming message.
Extra?: { [key: string]: string } // From submitted message.
Account: string
URL: string // Taken from config when webhook is scheduled.
Authorization: string // Optional value for authorization header to include in HTTP request.
IsIncoming: boolean
OutgoingEvent: string // Empty string if not outgoing.
Payload: string // JSON data to be submitted.
Submitted: Date
Attempts: number
NextAttempt: Date // Index for fast scheduling.
Results?: HookResult[] | null
}

// HookResult is the result of a single attempt to deliver a webhook.
export interface HookResult {
Start: Date
Duration: number
URL: string
Success: boolean
Code: number // eg 200, 404, 500. 2xx implies success.
Error: string
Response: string // Max 512 bytes of HTTP response body.
}
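For illustration only: how the "Last" and "Error" columns in the webhook queue rendering above can be derived from a hook's Results.

function lastHookResult(h: Hook): HookResult | undefined {
	// Results holds one entry per delivery attempt; the last entry is the most recent.
	const results = h.Results || [];
	return results.length > 0 ? results[results.length - 1] : undefined;
}

// Example use: const r = lastHookResult(hook); if (r && !r.Success) console.log(r.Code, r.Error);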

// HookRetiredFilter filters messages to list or operate on. Used by admin web interface
// and cli.
//
// Only non-empty/non-zero values are applied to the filter. Leaving all fields
// empty/zero matches all hooks.
export interface HookRetiredFilter {
Max: number
IDs?: number[] | null
Account: string
Submitted: string // Whether submitted before/after a time relative to now. ">$duration" or "<$duration", also with "now" for duration.
LastActivity: string // ">$duration" or "<$duration", also with "now" for duration.
Event: string // Including "incoming".
}

export interface HookRetiredSort {
Field: string // "Queued" or "LastActivity"/"".
LastID: number // If > 0, we return objects beyond this, less/greater depending on Asc.
Last: any // Value of Field for last object. Must be set iff LastID is set.
Asc: boolean // Ascending, or descending.
}

// HookRetired is a Hook that was delivered/failed/canceled and kept according
// to the configuration.
export interface HookRetired {
ID: number // Same as original Hook.ID.
QueueMsgID: number // Original queue Msg or MsgRetired ID. Zero for hooks for incoming messages.
FromID: string // As generated by us and returned in webapi call. Can be empty, for incoming messages to our base address.
MessageID: string // Of outgoing or incoming messages. Includes <>.
Subject: string // Subject of original outgoing message, or of incoming message.
Extra?: { [key: string]: string } // From submitted message.
Account: string
URL: string // Taken from config at start of each attempt.
Authorization: boolean // Whether request had authorization without keeping it around.
IsIncoming: boolean
OutgoingEvent: string
Payload: string // JSON data submitted.
Submitted: Date
SupersededByID: number // If not 0, a Hook.ID that superseded this one and Done will be true.
Attempts: number
Results?: HookResult[] | null
Success: boolean
LastActivity: Date
KeepUntil: Date
}

// WebserverConfig is the combination of WebDomainRedirects and WebHandlers
// from the domains.conf configuration file.
export interface WebserverConfig {
@@ -883,7 +1077,7 @@ export type Localpart = string
// be an IPv4 address.
export type IP = string

export const structTypes: {[typename: string]: boolean} = {"Account":true,"AuthResults":true,"AutoconfCheckResult":true,"AutodiscoverCheckResult":true,"AutodiscoverSRV":true,"AutomaticJunkFlags":true,"CheckResult":true,"ClientConfigs":true,"ClientConfigsEntry":true,"DANECheckResult":true,"DKIMAuthResult":true,"DKIMCheckResult":true,"DKIMRecord":true,"DMARCCheckResult":true,"DMARCRecord":true,"DMARCSummary":true,"DNSSECResult":true,"DateRange":true,"Destination":true,"Directive":true,"Domain":true,"DomainFeedback":true,"Evaluation":true,"EvaluationStat":true,"Extension":true,"FailureDetails":true,"Filter":true,"HoldRule":true,"IPDomain":true,"IPRevCheckResult":true,"Identifiers":true,"JunkFilter":true,"MTASTSCheckResult":true,"MTASTSRecord":true,"MX":true,"MXCheckResult":true,"Modifier":true,"Msg":true,"Pair":true,"Policy":true,"PolicyEvaluated":true,"PolicyOverrideReason":true,"PolicyPublished":true,"PolicyRecord":true,"Record":true,"Report":true,"ReportMetadata":true,"ReportRecord":true,"Result":true,"ResultPolicy":true,"Reverse":true,"Route":true,"Row":true,"Ruleset":true,"SMTPAuth":true,"SPFAuthResult":true,"SPFCheckResult":true,"SPFRecord":true,"SRV":true,"SRVConfCheckResult":true,"STSMX":true,"SubjectPass":true,"Summary":true,"SuppressAddress":true,"TLSCheckResult":true,"TLSRPTCheckResult":true,"TLSRPTDateRange":true,"TLSRPTRecord":true,"TLSRPTSummary":true,"TLSRPTSuppressAddress":true,"TLSReportRecord":true,"TLSResult":true,"Transport":true,"TransportDirect":true,"TransportSMTP":true,"TransportSocks":true,"URI":true,"WebForward":true,"WebHandler":true,"WebRedirect":true,"WebStatic":true,"WebserverConfig":true}
|
export const structTypes: {[typename: string]: boolean} = {"Account":true,"AuthResults":true,"AutoconfCheckResult":true,"AutodiscoverCheckResult":true,"AutodiscoverSRV":true,"AutomaticJunkFlags":true,"CheckResult":true,"ClientConfigs":true,"ClientConfigsEntry":true,"DANECheckResult":true,"DKIMAuthResult":true,"DKIMCheckResult":true,"DKIMRecord":true,"DMARCCheckResult":true,"DMARCRecord":true,"DMARCSummary":true,"DNSSECResult":true,"DateRange":true,"Destination":true,"Directive":true,"Domain":true,"DomainFeedback":true,"Evaluation":true,"EvaluationStat":true,"Extension":true,"FailureDetails":true,"Filter":true,"HoldRule":true,"Hook":true,"HookFilter":true,"HookResult":true,"HookRetired":true,"HookRetiredFilter":true,"HookRetiredSort":true,"HookSort":true,"IPDomain":true,"IPRevCheckResult":true,"Identifiers":true,"IncomingWebhook":true,"JunkFilter":true,"MTASTSCheckResult":true,"MTASTSRecord":true,"MX":true,"MXCheckResult":true,"Modifier":true,"Msg":true,"MsgResult":true,"MsgRetired":true,"OutgoingWebhook":true,"Pair":true,"Policy":true,"PolicyEvaluated":true,"PolicyOverrideReason":true,"PolicyPublished":true,"PolicyRecord":true,"Record":true,"Report":true,"ReportMetadata":true,"ReportRecord":true,"Result":true,"ResultPolicy":true,"RetiredFilter":true,"RetiredSort":true,"Reverse":true,"Route":true,"Row":true,"Ruleset":true,"SMTPAuth":true,"SPFAuthResult":true,"SPFCheckResult":true,"SPFRecord":true,"SRV":true,"SRVConfCheckResult":true,"STSMX":true,"Sort":true,"SubjectPass":true,"Summary":true,"SuppressAddress":true,"TLSCheckResult":true,"TLSRPTCheckResult":true,"TLSRPTDateRange":true,"TLSRPTRecord":true,"TLSRPTSummary":true,"TLSRPTSuppressAddress":true,"TLSReportRecord":true,"TLSResult":true,"Transport":true,"TransportDirect":true,"TransportSMTP":true,"TransportSocks":true,"URI":true,"WebForward":true,"WebHandler":true,"WebRedirect":true,"WebStatic":true,"WebserverConfig":true}
|
||||||
export const stringsTypes: {[typename: string]: boolean} = {"Align":true,"Alignment":true,"CSRFToken":true,"DKIMResult":true,"DMARCPolicy":true,"DMARCResult":true,"Disposition":true,"IP":true,"Localpart":true,"Mode":true,"PolicyOverride":true,"PolicyType":true,"RUA":true,"ResultType":true,"SPFDomainScope":true,"SPFResult":true}
|
export const stringsTypes: {[typename: string]: boolean} = {"Align":true,"Alignment":true,"CSRFToken":true,"DKIMResult":true,"DMARCPolicy":true,"DMARCResult":true,"Disposition":true,"IP":true,"Localpart":true,"Mode":true,"PolicyOverride":true,"PolicyType":true,"RUA":true,"ResultType":true,"SPFDomainScope":true,"SPFResult":true}
|
||||||
export const intsTypes: {[typename: string]: boolean} = {}
|
export const intsTypes: {[typename: string]: boolean} = {}
|
||||||
export const types: TypenameMap = {
|
export const types: TypenameMap = {
|
||||||
|
@ -918,7 +1112,9 @@ export const types: TypenameMap = {
|
||||||
"AutoconfCheckResult": {"Name":"AutoconfCheckResult","Docs":"","Fields":[{"Name":"ClientSettingsDomainIPs","Docs":"","Typewords":["[]","string"]},{"Name":"IPs","Docs":"","Typewords":["[]","string"]},{"Name":"Errors","Docs":"","Typewords":["[]","string"]},{"Name":"Warnings","Docs":"","Typewords":["[]","string"]},{"Name":"Instructions","Docs":"","Typewords":["[]","string"]}]},
|
"AutoconfCheckResult": {"Name":"AutoconfCheckResult","Docs":"","Fields":[{"Name":"ClientSettingsDomainIPs","Docs":"","Typewords":["[]","string"]},{"Name":"IPs","Docs":"","Typewords":["[]","string"]},{"Name":"Errors","Docs":"","Typewords":["[]","string"]},{"Name":"Warnings","Docs":"","Typewords":["[]","string"]},{"Name":"Instructions","Docs":"","Typewords":["[]","string"]}]},
|
||||||
"AutodiscoverCheckResult": {"Name":"AutodiscoverCheckResult","Docs":"","Fields":[{"Name":"Records","Docs":"","Typewords":["[]","AutodiscoverSRV"]},{"Name":"Errors","Docs":"","Typewords":["[]","string"]},{"Name":"Warnings","Docs":"","Typewords":["[]","string"]},{"Name":"Instructions","Docs":"","Typewords":["[]","string"]}]},
|
"AutodiscoverCheckResult": {"Name":"AutodiscoverCheckResult","Docs":"","Fields":[{"Name":"Records","Docs":"","Typewords":["[]","AutodiscoverSRV"]},{"Name":"Errors","Docs":"","Typewords":["[]","string"]},{"Name":"Warnings","Docs":"","Typewords":["[]","string"]},{"Name":"Instructions","Docs":"","Typewords":["[]","string"]}]},
|
||||||
"AutodiscoverSRV": {"Name":"AutodiscoverSRV","Docs":"","Fields":[{"Name":"Target","Docs":"","Typewords":["string"]},{"Name":"Port","Docs":"","Typewords":["uint16"]},{"Name":"Priority","Docs":"","Typewords":["uint16"]},{"Name":"Weight","Docs":"","Typewords":["uint16"]},{"Name":"IPs","Docs":"","Typewords":["[]","string"]}]},
|
"AutodiscoverSRV": {"Name":"AutodiscoverSRV","Docs":"","Fields":[{"Name":"Target","Docs":"","Typewords":["string"]},{"Name":"Port","Docs":"","Typewords":["uint16"]},{"Name":"Priority","Docs":"","Typewords":["uint16"]},{"Name":"Weight","Docs":"","Typewords":["uint16"]},{"Name":"IPs","Docs":"","Typewords":["[]","string"]}]},
|
||||||
"Account": {"Name":"Account","Docs":"","Fields":[{"Name":"Domain","Docs":"","Typewords":["string"]},{"Name":"Description","Docs":"","Typewords":["string"]},{"Name":"FullName","Docs":"","Typewords":["string"]},{"Name":"Destinations","Docs":"","Typewords":["{}","Destination"]},{"Name":"SubjectPass","Docs":"","Typewords":["SubjectPass"]},{"Name":"QuotaMessageSize","Docs":"","Typewords":["int64"]},{"Name":"RejectsMailbox","Docs":"","Typewords":["string"]},{"Name":"KeepRejects","Docs":"","Typewords":["bool"]},{"Name":"AutomaticJunkFlags","Docs":"","Typewords":["AutomaticJunkFlags"]},{"Name":"JunkFilter","Docs":"","Typewords":["nullable","JunkFilter"]},{"Name":"MaxOutgoingMessagesPerDay","Docs":"","Typewords":["int32"]},{"Name":"MaxFirstTimeRecipientsPerDay","Docs":"","Typewords":["int32"]},{"Name":"NoFirstTimeSenderDelay","Docs":"","Typewords":["bool"]},{"Name":"Routes","Docs":"","Typewords":["[]","Route"]},{"Name":"DNSDomain","Docs":"","Typewords":["Domain"]}]},
|
"Account": {"Name":"Account","Docs":"","Fields":[{"Name":"OutgoingWebhook","Docs":"","Typewords":["nullable","OutgoingWebhook"]},{"Name":"IncomingWebhook","Docs":"","Typewords":["nullable","IncomingWebhook"]},{"Name":"FromIDLoginAddresses","Docs":"","Typewords":["[]","string"]},{"Name":"KeepRetiredMessagePeriod","Docs":"","Typewords":["int64"]},{"Name":"KeepRetiredWebhookPeriod","Docs":"","Typewords":["int64"]},{"Name":"Domain","Docs":"","Typewords":["string"]},{"Name":"Description","Docs":"","Typewords":["string"]},{"Name":"FullName","Docs":"","Typewords":["string"]},{"Name":"Destinations","Docs":"","Typewords":["{}","Destination"]},{"Name":"SubjectPass","Docs":"","Typewords":["SubjectPass"]},{"Name":"QuotaMessageSize","Docs":"","Typewords":["int64"]},{"Name":"RejectsMailbox","Docs":"","Typewords":["string"]},{"Name":"KeepRejects","Docs":"","Typewords":["bool"]},{"Name":"AutomaticJunkFlags","Docs":"","Typewords":["AutomaticJunkFlags"]},{"Name":"JunkFilter","Docs":"","Typewords":["nullable","JunkFilter"]},{"Name":"MaxOutgoingMessagesPerDay","Docs":"","Typewords":["int32"]},{"Name":"MaxFirstTimeRecipientsPerDay","Docs":"","Typewords":["int32"]},{"Name":"NoFirstTimeSenderDelay","Docs":"","Typewords":["bool"]},{"Name":"Routes","Docs":"","Typewords":["[]","Route"]},{"Name":"DNSDomain","Docs":"","Typewords":["Domain"]}]},
|
||||||
|
"OutgoingWebhook": {"Name":"OutgoingWebhook","Docs":"","Fields":[{"Name":"URL","Docs":"","Typewords":["string"]},{"Name":"Authorization","Docs":"","Typewords":["string"]},{"Name":"Events","Docs":"","Typewords":["[]","string"]}]},
|
||||||
|
"IncomingWebhook": {"Name":"IncomingWebhook","Docs":"","Fields":[{"Name":"URL","Docs":"","Typewords":["string"]},{"Name":"Authorization","Docs":"","Typewords":["string"]}]},
|
||||||
"Destination": {"Name":"Destination","Docs":"","Fields":[{"Name":"Mailbox","Docs":"","Typewords":["string"]},{"Name":"Rulesets","Docs":"","Typewords":["[]","Ruleset"]},{"Name":"FullName","Docs":"","Typewords":["string"]}]},
|
"Destination": {"Name":"Destination","Docs":"","Fields":[{"Name":"Mailbox","Docs":"","Typewords":["string"]},{"Name":"Rulesets","Docs":"","Typewords":["[]","Ruleset"]},{"Name":"FullName","Docs":"","Typewords":["string"]}]},
|
||||||
"Ruleset": {"Name":"Ruleset","Docs":"","Fields":[{"Name":"SMTPMailFromRegexp","Docs":"","Typewords":["string"]},{"Name":"VerifiedDomain","Docs":"","Typewords":["string"]},{"Name":"HeadersRegexp","Docs":"","Typewords":["{}","string"]},{"Name":"IsForward","Docs":"","Typewords":["bool"]},{"Name":"ListAllowDomain","Docs":"","Typewords":["string"]},{"Name":"AcceptRejectsToMailbox","Docs":"","Typewords":["string"]},{"Name":"Mailbox","Docs":"","Typewords":["string"]},{"Name":"VerifiedDNSDomain","Docs":"","Typewords":["Domain"]},{"Name":"ListAllowDNSDomain","Docs":"","Typewords":["Domain"]}]},
|
"Ruleset": {"Name":"Ruleset","Docs":"","Fields":[{"Name":"SMTPMailFromRegexp","Docs":"","Typewords":["string"]},{"Name":"VerifiedDomain","Docs":"","Typewords":["string"]},{"Name":"HeadersRegexp","Docs":"","Typewords":["{}","string"]},{"Name":"IsForward","Docs":"","Typewords":["bool"]},{"Name":"ListAllowDomain","Docs":"","Typewords":["string"]},{"Name":"AcceptRejectsToMailbox","Docs":"","Typewords":["string"]},{"Name":"Mailbox","Docs":"","Typewords":["string"]},{"Name":"VerifiedDNSDomain","Docs":"","Typewords":["Domain"]},{"Name":"ListAllowDNSDomain","Docs":"","Typewords":["Domain"]}]},
|
||||||
"SubjectPass": {"Name":"SubjectPass","Docs":"","Fields":[{"Name":"Period","Docs":"","Typewords":["int64"]}]},
|
"SubjectPass": {"Name":"SubjectPass","Docs":"","Fields":[{"Name":"Period","Docs":"","Typewords":["int64"]}]},
|
||||||
|
@ -951,9 +1147,21 @@ export const types: TypenameMap = {
|
||||||
"ClientConfigs": {"Name":"ClientConfigs","Docs":"","Fields":[{"Name":"Entries","Docs":"","Typewords":["[]","ClientConfigsEntry"]}]},
|
"ClientConfigs": {"Name":"ClientConfigs","Docs":"","Fields":[{"Name":"Entries","Docs":"","Typewords":["[]","ClientConfigsEntry"]}]},
|
||||||
"ClientConfigsEntry": {"Name":"ClientConfigsEntry","Docs":"","Fields":[{"Name":"Protocol","Docs":"","Typewords":["string"]},{"Name":"Host","Docs":"","Typewords":["Domain"]},{"Name":"Port","Docs":"","Typewords":["int32"]},{"Name":"Listener","Docs":"","Typewords":["string"]},{"Name":"Note","Docs":"","Typewords":["string"]}]},
|
"ClientConfigsEntry": {"Name":"ClientConfigsEntry","Docs":"","Fields":[{"Name":"Protocol","Docs":"","Typewords":["string"]},{"Name":"Host","Docs":"","Typewords":["Domain"]},{"Name":"Port","Docs":"","Typewords":["int32"]},{"Name":"Listener","Docs":"","Typewords":["string"]},{"Name":"Note","Docs":"","Typewords":["string"]}]},
|
||||||
"HoldRule": {"Name":"HoldRule","Docs":"","Fields":[{"Name":"ID","Docs":"","Typewords":["int64"]},{"Name":"Account","Docs":"","Typewords":["string"]},{"Name":"SenderDomain","Docs":"","Typewords":["Domain"]},{"Name":"RecipientDomain","Docs":"","Typewords":["Domain"]},{"Name":"SenderDomainStr","Docs":"","Typewords":["string"]},{"Name":"RecipientDomainStr","Docs":"","Typewords":["string"]}]},
|
"HoldRule": {"Name":"HoldRule","Docs":"","Fields":[{"Name":"ID","Docs":"","Typewords":["int64"]},{"Name":"Account","Docs":"","Typewords":["string"]},{"Name":"SenderDomain","Docs":"","Typewords":["Domain"]},{"Name":"RecipientDomain","Docs":"","Typewords":["Domain"]},{"Name":"SenderDomainStr","Docs":"","Typewords":["string"]},{"Name":"RecipientDomainStr","Docs":"","Typewords":["string"]}]},
|
||||||
"Filter": {"Name":"Filter","Docs":"","Fields":[{"Name":"IDs","Docs":"","Typewords":["[]","int64"]},{"Name":"Account","Docs":"","Typewords":["string"]},{"Name":"From","Docs":"","Typewords":["string"]},{"Name":"To","Docs":"","Typewords":["string"]},{"Name":"Hold","Docs":"","Typewords":["nullable","bool"]},{"Name":"Submitted","Docs":"","Typewords":["string"]},{"Name":"NextAttempt","Docs":"","Typewords":["string"]},{"Name":"Transport","Docs":"","Typewords":["nullable","string"]}]},
|
"Filter": {"Name":"Filter","Docs":"","Fields":[{"Name":"Max","Docs":"","Typewords":["int32"]},{"Name":"IDs","Docs":"","Typewords":["[]","int64"]},{"Name":"Account","Docs":"","Typewords":["string"]},{"Name":"From","Docs":"","Typewords":["string"]},{"Name":"To","Docs":"","Typewords":["string"]},{"Name":"Hold","Docs":"","Typewords":["nullable","bool"]},{"Name":"Submitted","Docs":"","Typewords":["string"]},{"Name":"NextAttempt","Docs":"","Typewords":["string"]},{"Name":"Transport","Docs":"","Typewords":["nullable","string"]}]},
|
||||||
"Msg": {"Name":"Msg","Docs":"","Fields":[{"Name":"ID","Docs":"","Typewords":["int64"]},{"Name":"BaseID","Docs":"","Typewords":["int64"]},{"Name":"Queued","Docs":"","Typewords":["timestamp"]},{"Name":"Hold","Docs":"","Typewords":["bool"]},{"Name":"SenderAccount","Docs":"","Typewords":["string"]},{"Name":"SenderLocalpart","Docs":"","Typewords":["Localpart"]},{"Name":"SenderDomain","Docs":"","Typewords":["IPDomain"]},{"Name":"SenderDomainStr","Docs":"","Typewords":["string"]},{"Name":"RecipientLocalpart","Docs":"","Typewords":["Localpart"]},{"Name":"RecipientDomain","Docs":"","Typewords":["IPDomain"]},{"Name":"RecipientDomainStr","Docs":"","Typewords":["string"]},{"Name":"Attempts","Docs":"","Typewords":["int32"]},{"Name":"MaxAttempts","Docs":"","Typewords":["int32"]},{"Name":"DialedIPs","Docs":"","Typewords":["{}","[]","IP"]},{"Name":"NextAttempt","Docs":"","Typewords":["timestamp"]},{"Name":"LastAttempt","Docs":"","Typewords":["nullable","timestamp"]},{"Name":"LastError","Docs":"","Typewords":["string"]},{"Name":"Has8bit","Docs":"","Typewords":["bool"]},{"Name":"SMTPUTF8","Docs":"","Typewords":["bool"]},{"Name":"IsDMARCReport","Docs":"","Typewords":["bool"]},{"Name":"IsTLSReport","Docs":"","Typewords":["bool"]},{"Name":"Size","Docs":"","Typewords":["int64"]},{"Name":"MessageID","Docs":"","Typewords":["string"]},{"Name":"MsgPrefix","Docs":"","Typewords":["nullable","string"]},{"Name":"DSNUTF8","Docs":"","Typewords":["nullable","string"]},{"Name":"Transport","Docs":"","Typewords":["string"]},{"Name":"RequireTLS","Docs":"","Typewords":["nullable","bool"]},{"Name":"FutureReleaseRequest","Docs":"","Typewords":["string"]}]},
|
"Sort": {"Name":"Sort","Docs":"","Fields":[{"Name":"Field","Docs":"","Typewords":["string"]},{"Name":"LastID","Docs":"","Typewords":["int64"]},{"Name":"Last","Docs":"","Typewords":["any"]},{"Name":"Asc","Docs":"","Typewords":["bool"]}]},
|
||||||
|
"Msg": {"Name":"Msg","Docs":"","Fields":[{"Name":"ID","Docs":"","Typewords":["int64"]},{"Name":"BaseID","Docs":"","Typewords":["int64"]},{"Name":"Queued","Docs":"","Typewords":["timestamp"]},{"Name":"Hold","Docs":"","Typewords":["bool"]},{"Name":"SenderAccount","Docs":"","Typewords":["string"]},{"Name":"SenderLocalpart","Docs":"","Typewords":["Localpart"]},{"Name":"SenderDomain","Docs":"","Typewords":["IPDomain"]},{"Name":"SenderDomainStr","Docs":"","Typewords":["string"]},{"Name":"FromID","Docs":"","Typewords":["string"]},{"Name":"RecipientLocalpart","Docs":"","Typewords":["Localpart"]},{"Name":"RecipientDomain","Docs":"","Typewords":["IPDomain"]},{"Name":"RecipientDomainStr","Docs":"","Typewords":["string"]},{"Name":"Attempts","Docs":"","Typewords":["int32"]},{"Name":"MaxAttempts","Docs":"","Typewords":["int32"]},{"Name":"DialedIPs","Docs":"","Typewords":["{}","[]","IP"]},{"Name":"NextAttempt","Docs":"","Typewords":["timestamp"]},{"Name":"LastAttempt","Docs":"","Typewords":["nullable","timestamp"]},{"Name":"Results","Docs":"","Typewords":["[]","MsgResult"]},{"Name":"Has8bit","Docs":"","Typewords":["bool"]},{"Name":"SMTPUTF8","Docs":"","Typewords":["bool"]},{"Name":"IsDMARCReport","Docs":"","Typewords":["bool"]},{"Name":"IsTLSReport","Docs":"","Typewords":["bool"]},{"Name":"Size","Docs":"","Typewords":["int64"]},{"Name":"MessageID","Docs":"","Typewords":["string"]},{"Name":"MsgPrefix","Docs":"","Typewords":["nullable","string"]},{"Name":"Subject","Docs":"","Typewords":["string"]},{"Name":"DSNUTF8","Docs":"","Typewords":["nullable","string"]},{"Name":"Transport","Docs":"","Typewords":["string"]},{"Name":"RequireTLS","Docs":"","Typewords":["nullable","bool"]},{"Name":"FutureReleaseRequest","Docs":"","Typewords":["string"]},{"Name":"Extra","Docs":"","Typewords":["{}","string"]}]},
|
||||||
"IPDomain": {"Name":"IPDomain","Docs":"","Fields":[{"Name":"IP","Docs":"","Typewords":["IP"]},{"Name":"Domain","Docs":"","Typewords":["Domain"]}]},
|
"IPDomain": {"Name":"IPDomain","Docs":"","Fields":[{"Name":"IP","Docs":"","Typewords":["IP"]},{"Name":"Domain","Docs":"","Typewords":["Domain"]}]},
|
||||||
|
"MsgResult": {"Name":"MsgResult","Docs":"","Fields":[{"Name":"Start","Docs":"","Typewords":["timestamp"]},{"Name":"Duration","Docs":"","Typewords":["int64"]},{"Name":"Success","Docs":"","Typewords":["bool"]},{"Name":"Code","Docs":"","Typewords":["int32"]},{"Name":"Secode","Docs":"","Typewords":["string"]},{"Name":"Error","Docs":"","Typewords":["string"]}]},
|
||||||
|
"RetiredFilter": {"Name":"RetiredFilter","Docs":"","Fields":[{"Name":"Max","Docs":"","Typewords":["int32"]},{"Name":"IDs","Docs":"","Typewords":["[]","int64"]},{"Name":"Account","Docs":"","Typewords":["string"]},{"Name":"From","Docs":"","Typewords":["string"]},{"Name":"To","Docs":"","Typewords":["string"]},{"Name":"Submitted","Docs":"","Typewords":["string"]},{"Name":"LastActivity","Docs":"","Typewords":["string"]},{"Name":"Transport","Docs":"","Typewords":["nullable","string"]},{"Name":"Success","Docs":"","Typewords":["nullable","bool"]}]},
|
||||||
|
"RetiredSort": {"Name":"RetiredSort","Docs":"","Fields":[{"Name":"Field","Docs":"","Typewords":["string"]},{"Name":"LastID","Docs":"","Typewords":["int64"]},{"Name":"Last","Docs":"","Typewords":["any"]},{"Name":"Asc","Docs":"","Typewords":["bool"]}]},
|
||||||
|
"MsgRetired": {"Name":"MsgRetired","Docs":"","Fields":[{"Name":"ID","Docs":"","Typewords":["int64"]},{"Name":"BaseID","Docs":"","Typewords":["int64"]},{"Name":"Queued","Docs":"","Typewords":["timestamp"]},{"Name":"SenderAccount","Docs":"","Typewords":["string"]},{"Name":"SenderLocalpart","Docs":"","Typewords":["Localpart"]},{"Name":"SenderDomainStr","Docs":"","Typewords":["string"]},{"Name":"FromID","Docs":"","Typewords":["string"]},{"Name":"RecipientLocalpart","Docs":"","Typewords":["Localpart"]},{"Name":"RecipientDomain","Docs":"","Typewords":["IPDomain"]},{"Name":"RecipientDomainStr","Docs":"","Typewords":["string"]},{"Name":"Attempts","Docs":"","Typewords":["int32"]},{"Name":"MaxAttempts","Docs":"","Typewords":["int32"]},{"Name":"DialedIPs","Docs":"","Typewords":["{}","[]","IP"]},{"Name":"LastAttempt","Docs":"","Typewords":["nullable","timestamp"]},{"Name":"Results","Docs":"","Typewords":["[]","MsgResult"]},{"Name":"Has8bit","Docs":"","Typewords":["bool"]},{"Name":"SMTPUTF8","Docs":"","Typewords":["bool"]},{"Name":"IsDMARCReport","Docs":"","Typewords":["bool"]},{"Name":"IsTLSReport","Docs":"","Typewords":["bool"]},{"Name":"Size","Docs":"","Typewords":["int64"]},{"Name":"MessageID","Docs":"","Typewords":["string"]},{"Name":"Subject","Docs":"","Typewords":["string"]},{"Name":"Transport","Docs":"","Typewords":["string"]},{"Name":"RequireTLS","Docs":"","Typewords":["nullable","bool"]},{"Name":"FutureReleaseRequest","Docs":"","Typewords":["string"]},{"Name":"Extra","Docs":"","Typewords":["{}","string"]},{"Name":"LastActivity","Docs":"","Typewords":["timestamp"]},{"Name":"RecipientAddress","Docs":"","Typewords":["string"]},{"Name":"Success","Docs":"","Typewords":["bool"]},{"Name":"KeepUntil","Docs":"","Typewords":["timestamp"]}]},
|
||||||
|
"HookFilter": {"Name":"HookFilter","Docs":"","Fields":[{"Name":"Max","Docs":"","Typewords":["int32"]},{"Name":"IDs","Docs":"","Typewords":["[]","int64"]},{"Name":"Account","Docs":"","Typewords":["string"]},{"Name":"Submitted","Docs":"","Typewords":["string"]},{"Name":"NextAttempt","Docs":"","Typewords":["string"]},{"Name":"Event","Docs":"","Typewords":["string"]}]},
|
||||||
|
"HookSort": {"Name":"HookSort","Docs":"","Fields":[{"Name":"Field","Docs":"","Typewords":["string"]},{"Name":"LastID","Docs":"","Typewords":["int64"]},{"Name":"Last","Docs":"","Typewords":["any"]},{"Name":"Asc","Docs":"","Typewords":["bool"]}]},
|
||||||
|
"Hook": {"Name":"Hook","Docs":"","Fields":[{"Name":"ID","Docs":"","Typewords":["int64"]},{"Name":"QueueMsgID","Docs":"","Typewords":["int64"]},{"Name":"FromID","Docs":"","Typewords":["string"]},{"Name":"MessageID","Docs":"","Typewords":["string"]},{"Name":"Subject","Docs":"","Typewords":["string"]},{"Name":"Extra","Docs":"","Typewords":["{}","string"]},{"Name":"Account","Docs":"","Typewords":["string"]},{"Name":"URL","Docs":"","Typewords":["string"]},{"Name":"Authorization","Docs":"","Typewords":["string"]},{"Name":"IsIncoming","Docs":"","Typewords":["bool"]},{"Name":"OutgoingEvent","Docs":"","Typewords":["string"]},{"Name":"Payload","Docs":"","Typewords":["string"]},{"Name":"Submitted","Docs":"","Typewords":["timestamp"]},{"Name":"Attempts","Docs":"","Typewords":["int32"]},{"Name":"NextAttempt","Docs":"","Typewords":["timestamp"]},{"Name":"Results","Docs":"","Typewords":["[]","HookResult"]}]},
|
||||||
|
"HookResult": {"Name":"HookResult","Docs":"","Fields":[{"Name":"Start","Docs":"","Typewords":["timestamp"]},{"Name":"Duration","Docs":"","Typewords":["int64"]},{"Name":"URL","Docs":"","Typewords":["string"]},{"Name":"Success","Docs":"","Typewords":["bool"]},{"Name":"Code","Docs":"","Typewords":["int32"]},{"Name":"Error","Docs":"","Typewords":["string"]},{"Name":"Response","Docs":"","Typewords":["string"]}]},
|
||||||
|
"HookRetiredFilter": {"Name":"HookRetiredFilter","Docs":"","Fields":[{"Name":"Max","Docs":"","Typewords":["int32"]},{"Name":"IDs","Docs":"","Typewords":["[]","int64"]},{"Name":"Account","Docs":"","Typewords":["string"]},{"Name":"Submitted","Docs":"","Typewords":["string"]},{"Name":"LastActivity","Docs":"","Typewords":["string"]},{"Name":"Event","Docs":"","Typewords":["string"]}]},
|
||||||
|
"HookRetiredSort": {"Name":"HookRetiredSort","Docs":"","Fields":[{"Name":"Field","Docs":"","Typewords":["string"]},{"Name":"LastID","Docs":"","Typewords":["int64"]},{"Name":"Last","Docs":"","Typewords":["any"]},{"Name":"Asc","Docs":"","Typewords":["bool"]}]},
|
||||||
|
"HookRetired": {"Name":"HookRetired","Docs":"","Fields":[{"Name":"ID","Docs":"","Typewords":["int64"]},{"Name":"QueueMsgID","Docs":"","Typewords":["int64"]},{"Name":"FromID","Docs":"","Typewords":["string"]},{"Name":"MessageID","Docs":"","Typewords":["string"]},{"Name":"Subject","Docs":"","Typewords":["string"]},{"Name":"Extra","Docs":"","Typewords":["{}","string"]},{"Name":"Account","Docs":"","Typewords":["string"]},{"Name":"URL","Docs":"","Typewords":["string"]},{"Name":"Authorization","Docs":"","Typewords":["bool"]},{"Name":"IsIncoming","Docs":"","Typewords":["bool"]},{"Name":"OutgoingEvent","Docs":"","Typewords":["string"]},{"Name":"Payload","Docs":"","Typewords":["string"]},{"Name":"Submitted","Docs":"","Typewords":["timestamp"]},{"Name":"SupersededByID","Docs":"","Typewords":["int64"]},{"Name":"Attempts","Docs":"","Typewords":["int32"]},{"Name":"Results","Docs":"","Typewords":["[]","HookResult"]},{"Name":"Success","Docs":"","Typewords":["bool"]},{"Name":"LastActivity","Docs":"","Typewords":["timestamp"]},{"Name":"KeepUntil","Docs":"","Typewords":["timestamp"]}]},
|
||||||
"WebserverConfig": {"Name":"WebserverConfig","Docs":"","Fields":[{"Name":"WebDNSDomainRedirects","Docs":"","Typewords":["[]","[]","Domain"]},{"Name":"WebDomainRedirects","Docs":"","Typewords":["[]","[]","string"]},{"Name":"WebHandlers","Docs":"","Typewords":["[]","WebHandler"]}]},
|
"WebserverConfig": {"Name":"WebserverConfig","Docs":"","Fields":[{"Name":"WebDNSDomainRedirects","Docs":"","Typewords":["[]","[]","Domain"]},{"Name":"WebDomainRedirects","Docs":"","Typewords":["[]","[]","string"]},{"Name":"WebHandlers","Docs":"","Typewords":["[]","WebHandler"]}]},
|
||||||
"WebHandler": {"Name":"WebHandler","Docs":"","Fields":[{"Name":"LogName","Docs":"","Typewords":["string"]},{"Name":"Domain","Docs":"","Typewords":["string"]},{"Name":"PathRegexp","Docs":"","Typewords":["string"]},{"Name":"DontRedirectPlainHTTP","Docs":"","Typewords":["bool"]},{"Name":"Compress","Docs":"","Typewords":["bool"]},{"Name":"WebStatic","Docs":"","Typewords":["nullable","WebStatic"]},{"Name":"WebRedirect","Docs":"","Typewords":["nullable","WebRedirect"]},{"Name":"WebForward","Docs":"","Typewords":["nullable","WebForward"]},{"Name":"Name","Docs":"","Typewords":["string"]},{"Name":"DNSDomain","Docs":"","Typewords":["Domain"]}]},
|
"WebHandler": {"Name":"WebHandler","Docs":"","Fields":[{"Name":"LogName","Docs":"","Typewords":["string"]},{"Name":"Domain","Docs":"","Typewords":["string"]},{"Name":"PathRegexp","Docs":"","Typewords":["string"]},{"Name":"DontRedirectPlainHTTP","Docs":"","Typewords":["bool"]},{"Name":"Compress","Docs":"","Typewords":["bool"]},{"Name":"WebStatic","Docs":"","Typewords":["nullable","WebStatic"]},{"Name":"WebRedirect","Docs":"","Typewords":["nullable","WebRedirect"]},{"Name":"WebForward","Docs":"","Typewords":["nullable","WebForward"]},{"Name":"Name","Docs":"","Typewords":["string"]},{"Name":"DNSDomain","Docs":"","Typewords":["Domain"]}]},
|
||||||
"WebStatic": {"Name":"WebStatic","Docs":"","Fields":[{"Name":"StripPrefix","Docs":"","Typewords":["string"]},{"Name":"Root","Docs":"","Typewords":["string"]},{"Name":"ListFiles","Docs":"","Typewords":["bool"]},{"Name":"ContinueNotFound","Docs":"","Typewords":["bool"]},{"Name":"ResponseHeaders","Docs":"","Typewords":["{}","string"]}]},
|
"WebStatic": {"Name":"WebStatic","Docs":"","Fields":[{"Name":"StripPrefix","Docs":"","Typewords":["string"]},{"Name":"Root","Docs":"","Typewords":["string"]},{"Name":"ListFiles","Docs":"","Typewords":["bool"]},{"Name":"ContinueNotFound","Docs":"","Typewords":["bool"]},{"Name":"ResponseHeaders","Docs":"","Typewords":["{}","string"]}]},
|
||||||
|
@@ -1020,6 +1228,8 @@ export const parser = {
|
||||||
AutodiscoverCheckResult: (v: any) => parse("AutodiscoverCheckResult", v) as AutodiscoverCheckResult,
|
AutodiscoverCheckResult: (v: any) => parse("AutodiscoverCheckResult", v) as AutodiscoverCheckResult,
|
||||||
AutodiscoverSRV: (v: any) => parse("AutodiscoverSRV", v) as AutodiscoverSRV,
|
AutodiscoverSRV: (v: any) => parse("AutodiscoverSRV", v) as AutodiscoverSRV,
|
||||||
Account: (v: any) => parse("Account", v) as Account,
|
Account: (v: any) => parse("Account", v) as Account,
|
||||||
|
OutgoingWebhook: (v: any) => parse("OutgoingWebhook", v) as OutgoingWebhook,
|
||||||
|
IncomingWebhook: (v: any) => parse("IncomingWebhook", v) as IncomingWebhook,
|
||||||
Destination: (v: any) => parse("Destination", v) as Destination,
|
Destination: (v: any) => parse("Destination", v) as Destination,
|
||||||
Ruleset: (v: any) => parse("Ruleset", v) as Ruleset,
|
Ruleset: (v: any) => parse("Ruleset", v) as Ruleset,
|
||||||
SubjectPass: (v: any) => parse("SubjectPass", v) as SubjectPass,
|
SubjectPass: (v: any) => parse("SubjectPass", v) as SubjectPass,
|
||||||
|
@@ -1053,8 +1263,20 @@ export const parser = {
|
||||||
ClientConfigsEntry: (v: any) => parse("ClientConfigsEntry", v) as ClientConfigsEntry,
|
ClientConfigsEntry: (v: any) => parse("ClientConfigsEntry", v) as ClientConfigsEntry,
|
||||||
HoldRule: (v: any) => parse("HoldRule", v) as HoldRule,
|
HoldRule: (v: any) => parse("HoldRule", v) as HoldRule,
|
||||||
Filter: (v: any) => parse("Filter", v) as Filter,
|
Filter: (v: any) => parse("Filter", v) as Filter,
|
||||||
|
Sort: (v: any) => parse("Sort", v) as Sort,
|
||||||
Msg: (v: any) => parse("Msg", v) as Msg,
|
Msg: (v: any) => parse("Msg", v) as Msg,
|
||||||
IPDomain: (v: any) => parse("IPDomain", v) as IPDomain,
|
IPDomain: (v: any) => parse("IPDomain", v) as IPDomain,
|
||||||
|
MsgResult: (v: any) => parse("MsgResult", v) as MsgResult,
|
||||||
|
RetiredFilter: (v: any) => parse("RetiredFilter", v) as RetiredFilter,
|
||||||
|
RetiredSort: (v: any) => parse("RetiredSort", v) as RetiredSort,
|
||||||
|
MsgRetired: (v: any) => parse("MsgRetired", v) as MsgRetired,
|
||||||
|
HookFilter: (v: any) => parse("HookFilter", v) as HookFilter,
|
||||||
|
HookSort: (v: any) => parse("HookSort", v) as HookSort,
|
||||||
|
Hook: (v: any) => parse("Hook", v) as Hook,
|
||||||
|
HookResult: (v: any) => parse("HookResult", v) as HookResult,
|
||||||
|
HookRetiredFilter: (v: any) => parse("HookRetiredFilter", v) as HookRetiredFilter,
|
||||||
|
HookRetiredSort: (v: any) => parse("HookRetiredSort", v) as HookRetiredSort,
|
||||||
|
HookRetired: (v: any) => parse("HookRetired", v) as HookRetired,
|
||||||
WebserverConfig: (v: any) => parse("WebserverConfig", v) as WebserverConfig,
|
WebserverConfig: (v: any) => parse("WebserverConfig", v) as WebserverConfig,
|
||||||
WebHandler: (v: any) => parse("WebHandler", v) as WebHandler,
|
WebHandler: (v: any) => parse("WebHandler", v) as WebHandler,
|
||||||
WebStatic: (v: any) => parse("WebStatic", v) as WebStatic,
|
WebStatic: (v: any) => parse("WebStatic", v) as WebStatic,
|
||||||
|
@@ -1457,11 +1679,11 @@ export class Client {
|
||||||
}
|
}
|
||||||
|
|
||||||
// QueueList returns the messages currently in the outgoing queue.
|
// QueueList returns the messages currently in the outgoing queue.
|
||||||
async QueueList(filter: Filter): Promise<Msg[] | null> {
|
async QueueList(filter: Filter, sort: Sort): Promise<Msg[] | null> {
|
||||||
const fn: string = "QueueList"
|
const fn: string = "QueueList"
|
||||||
const paramTypes: string[][] = [["Filter"]]
|
const paramTypes: string[][] = [["Filter"],["Sort"]]
|
||||||
const returnTypes: string[][] = [["[]","Msg"]]
|
const returnTypes: string[][] = [["[]","Msg"]]
|
||||||
const params: any[] = [filter]
|
const params: any[] = [filter, sort]
|
||||||
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as Msg[] | null
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as Msg[] | null
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@@ -1532,6 +1754,72 @@
|
||||||
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as number
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as number
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// RetiredList returns messages retired from the queue (delivery could
|
||||||
|
// have succeeded or failed).
|
||||||
|
async RetiredList(filter: RetiredFilter, sort: RetiredSort): Promise<MsgRetired[] | null> {
|
||||||
|
const fn: string = "RetiredList"
|
||||||
|
const paramTypes: string[][] = [["RetiredFilter"],["RetiredSort"]]
|
||||||
|
const returnTypes: string[][] = [["[]","MsgRetired"]]
|
||||||
|
const params: any[] = [filter, sort]
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as MsgRetired[] | null
|
||||||
|
}
|
||||||
|
|
||||||
|
// HookQueueSize returns the number of webhooks still to be delivered.
|
||||||
|
async HookQueueSize(): Promise<number> {
|
||||||
|
const fn: string = "HookQueueSize"
|
||||||
|
const paramTypes: string[][] = []
|
||||||
|
const returnTypes: string[][] = [["int32"]]
|
||||||
|
const params: any[] = []
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as number
|
||||||
|
}
|
||||||
|
|
||||||
|
// HookList lists webhooks still to be delivered.
|
||||||
|
async HookList(filter: HookFilter, sort: HookSort): Promise<Hook[] | null> {
|
||||||
|
const fn: string = "HookList"
|
||||||
|
const paramTypes: string[][] = [["HookFilter"],["HookSort"]]
|
||||||
|
const returnTypes: string[][] = [["[]","Hook"]]
|
||||||
|
const params: any[] = [filter, sort]
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as Hook[] | null
|
||||||
|
}
|
||||||
|
|
||||||
|
// HookNextAttemptSet sets a new time for next delivery attempt of matching
|
||||||
|
// hooks from the queue.
|
||||||
|
async HookNextAttemptSet(filter: HookFilter, minutes: number): Promise<number> {
|
||||||
|
const fn: string = "HookNextAttemptSet"
|
||||||
|
const paramTypes: string[][] = [["HookFilter"],["int32"]]
|
||||||
|
const returnTypes: string[][] = [["int32"]]
|
||||||
|
const params: any[] = [filter, minutes]
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as number
|
||||||
|
}
|
||||||
|
|
||||||
|
// HookNextAttemptAdd adds a duration to the time of next delivery attempt of
|
||||||
|
// matching hooks from the queue.
|
||||||
|
async HookNextAttemptAdd(filter: HookFilter, minutes: number): Promise<number> {
|
||||||
|
const fn: string = "HookNextAttemptAdd"
|
||||||
|
const paramTypes: string[][] = [["HookFilter"],["int32"]]
|
||||||
|
const returnTypes: string[][] = [["int32"]]
|
||||||
|
const params: any[] = [filter, minutes]
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as number
|
||||||
|
}
|
||||||
|
|
||||||
|
// HookRetiredList lists retired webhooks.
|
||||||
|
async HookRetiredList(filter: HookRetiredFilter, sort: HookRetiredSort): Promise<HookRetired[] | null> {
|
||||||
|
const fn: string = "HookRetiredList"
|
||||||
|
const paramTypes: string[][] = [["HookRetiredFilter"],["HookRetiredSort"]]
|
||||||
|
const returnTypes: string[][] = [["[]","HookRetired"]]
|
||||||
|
const params: any[] = [filter, sort]
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as HookRetired[] | null
|
||||||
|
}
|
||||||
|
|
||||||
|
// HookCancel prevents further delivery attempts of matching webhooks.
|
||||||
|
async HookCancel(filter: HookFilter): Promise<number> {
|
||||||
|
const fn: string = "HookCancel"
|
||||||
|
const paramTypes: string[][] = [["HookFilter"]]
|
||||||
|
const returnTypes: string[][] = [["int32"]]
|
||||||
|
const params: any[] = [filter]
|
||||||
|
return await _sherpaCall(this.baseURL, this.authState, { ...this.options }, paramTypes, returnTypes, fn, params) as number
|
||||||
|
}
|
||||||
|
|
||||||
// LogLevels returns the current log levels.
|
// LogLevels returns the current log levels.
|
||||||
async LogLevels(): Promise<{ [key: string]: string }> {
|
async LogLevels(): Promise<{ [key: string]: string }> {
|
||||||
const fn: string = "LogLevels"
|
const fn: string = "LogLevels"
|
||||||
|
|
244
webapi/client.go
Normal file
|
@@ -0,0 +1,244 @@
|
||||||
|
package webapi
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"encoding/json"
|
||||||
|
"fmt"
|
||||||
|
"io"
|
||||||
|
"net/http"
|
||||||
|
"net/url"
|
||||||
|
"strings"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Client can be used to call webapi methods.
|
||||||
|
// Client implements [Methods].
|
||||||
|
type Client struct {
|
||||||
|
BaseURL string // For example: http://localhost:1080/webapi/v0/.
|
||||||
|
Username string // Added as HTTP basic authentication if not empty.
|
||||||
|
Password string
|
||||||
|
HTTPClient *http.Client // Optional, defaults to http.DefaultClient.
|
||||||
|
}
|
||||||
|
|
||||||
|
var _ Methods = Client{}
|
||||||
|
|
||||||
|
func (c Client) httpClient() *http.Client {
|
||||||
|
if c.HTTPClient != nil {
|
||||||
|
return c.HTTPClient
|
||||||
|
}
|
||||||
|
return http.DefaultClient
|
||||||
|
}
|
||||||
|
|
||||||
|
func transact[T any](ctx context.Context, c Client, fn string, req any) (resp T, rerr error) {
|
||||||
|
hresp, err := httpDo(ctx, c, fn, req)
|
||||||
|
if err != nil {
|
||||||
|
return resp, err
|
||||||
|
}
|
||||||
|
defer hresp.Body.Close()
|
||||||
|
|
||||||
|
if hresp.StatusCode == http.StatusOK {
|
||||||
|
// Text and HTML of a message can each be 1MB. Another MB for other data would be a
|
||||||
|
// lot.
|
||||||
|
err := json.NewDecoder(&limitReader{hresp.Body, 3 * 1024 * 1024}).Decode(&resp)
|
||||||
|
return resp, err
|
||||||
|
}
|
||||||
|
return resp, badResponse(hresp)
|
||||||
|
}
|
||||||
|
|
||||||
|
func transactReadCloser(ctx context.Context, c Client, fn string, req any) (resp io.ReadCloser, rerr error) {
|
||||||
|
hresp, err := httpDo(ctx, c, fn, req)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
body := hresp.Body
|
||||||
|
defer func() {
|
||||||
|
if body != nil {
|
||||||
|
body.Close()
|
||||||
|
}
|
||||||
|
}()
|
||||||
|
if hresp.StatusCode == http.StatusOK {
|
||||||
|
r := body
|
||||||
|
body = nil
|
||||||
|
return r, nil
|
||||||
|
}
|
||||||
|
return nil, badResponse(hresp)
|
||||||
|
}
|
||||||
|
|
||||||
|
func httpDo(ctx context.Context, c Client, fn string, req any) (*http.Response, error) {
|
||||||
|
reqbuf, err := json.Marshal(req)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("marshal request: %v", err)
|
||||||
|
}
|
||||||
|
data := url.Values{}
|
||||||
|
data.Add("request", string(reqbuf))
|
||||||
|
hreq, err := http.NewRequestWithContext(ctx, "POST", c.BaseURL+fn, strings.NewReader(data.Encode()))
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("new request: %v", err)
|
||||||
|
}
|
||||||
|
hreq.Header.Set("Content-Type", "application/x-www-form-urlencoded")
|
||||||
|
if c.Username != "" {
|
||||||
|
hreq.SetBasicAuth(c.Username, c.Password)
|
||||||
|
}
|
||||||
|
hresp, err := c.httpClient().Do(hreq)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("http transaction: %v", err)
|
||||||
|
}
|
||||||
|
return hresp, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func badResponse(hresp *http.Response) error {
|
||||||
|
if hresp.StatusCode != http.StatusBadRequest {
|
||||||
|
return fmt.Errorf("http status %v, expected 200 ok", hresp.Status)
|
||||||
|
}
|
||||||
|
buf, err := io.ReadAll(&limitReader{R: hresp.Body, Limit: 10 * 1024})
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("reading error from remote: %v", err)
|
||||||
|
}
|
||||||
|
var xerr Error
|
||||||
|
err = json.Unmarshal(buf, &xerr)
|
||||||
|
if err != nil {
|
||||||
|
if len(buf) > 512 {
|
||||||
|
buf = buf[:512]
|
||||||
|
}
|
||||||
|
return fmt.Errorf("error parsing error from remote: %v (first 512 bytes of response: %s)", err, string(buf))
|
||||||
|
}
|
||||||
|
return xerr
|
||||||
|
}
|
||||||
|
|
||||||
|
// Send composes a message and submits it to the queue for delivery for all
|
||||||
|
// recipients (to, cc, bcc).
|
||||||
|
//
|
||||||
|
// Configure your account to use unique SMTP MAIL FROM addresses ("fromid") and to
|
||||||
|
// keep history of retired messages, for better handling of transactional email,
|
||||||
|
// automatically managing a suppression list.
|
||||||
|
//
|
||||||
|
// Configure webhooks to receive updates about deliveries.
|
||||||
|
//
|
||||||
|
// If the request is a multipart/form-data, uploaded files with the form keys
|
||||||
|
// "inlinefile" and/or "attachedfile" will be added to the message. If the uploaded
|
||||||
|
// file has content-type and/or content-id headers, they will be included. If no
|
||||||
|
// content-type is present in the request, and it can be detected, it is included
|
||||||
|
// automatically.
|
||||||
|
//
|
||||||
|
// Example call with a text and html message, with an inline and an attached image:
|
||||||
|
//
|
||||||
|
// curl --user mox@localhost:moxmoxmox \
|
||||||
|
// --form request='{"To": [{"Address": "mox@localhost"}], "Text": "hi ☺", "HTML": "<img src=\"cid:hi\" />"}' \
|
||||||
|
// --form 'inlinefile=@hi.png;headers="Content-ID: <hi>"' \
|
||||||
|
// --form attachedfile=@mox.png \
|
||||||
|
// http://localhost:1080/webapi/v0/Send
|
||||||
|
//
|
||||||
|
// Error codes:
|
||||||
|
//
|
||||||
|
// - badAddress, if an email address is invalid.
|
||||||
|
// - missingBody, if no text and no html body was specified.
|
||||||
|
// - multipleFrom, if multiple from addresses were specified.
|
||||||
|
// - badFrom, if a from address was specified that isn't configured for the account.
|
||||||
|
// - noRecipients, if no recipients were specified.
|
||||||
|
// - messageLimitReached, if the outgoing message rate limit was reached.
|
||||||
|
// - recipientLimitReached, if the outgoing new recipient rate limit was reached.
|
||||||
|
// - messageTooLarge, message larger than configured maximum size.
|
||||||
|
// - malformedMessageID, if MessageID is specified but invalid.
|
||||||
|
// - sentOverQuota, message submitted, but not stored in Sent mailbox due to quota reached.
|
||||||
|
func (c Client) Send(ctx context.Context, req SendRequest) (resp SendResult, err error) {
|
||||||
|
return transact[SendResult](ctx, c, "Send", req)
|
||||||
|
}
|
||||||
|
|
||||||
|
// SuppressionList returns the addresses on the per-account suppression list.
|
||||||
|
func (c Client) SuppressionList(ctx context.Context, req SuppressionListRequest) (resp SuppressionListResult, err error) {
|
||||||
|
return transact[SuppressionListResult](ctx, c, "SuppressionList", req)
|
||||||
|
}
|
||||||
|
|
||||||
|
// SuppressionAdd adds an address to the suppression list of the account.
|
||||||
|
//
|
||||||
|
// Error codes:
|
||||||
|
//
|
||||||
|
// - badAddress, if the email address is invalid.
|
||||||
|
func (c Client) SuppressionAdd(ctx context.Context, req SuppressionAddRequest) (resp SuppressionAddResult, err error) {
|
||||||
|
return transact[SuppressionAddResult](ctx, c, "SuppressionAdd", req)
|
||||||
|
}
|
||||||
|
|
||||||
|
// SuppressionRemove removes an address from the suppression list of the account.
|
||||||
|
//
|
||||||
|
// Error codes:
|
||||||
|
//
|
||||||
|
// - badAddress, if the email address is invalid.
|
||||||
|
func (c Client) SuppressionRemove(ctx context.Context, req SuppressionRemoveRequest) (resp SuppressionRemoveResult, err error) {
|
||||||
|
return transact[SuppressionRemoveResult](ctx, c, "SuppressionRemove", req)
|
||||||
|
}
|
||||||
|
|
||||||
|
// SuppressionPresent returns whether an address is present in the suppression list of the account.
|
||||||
|
//
|
||||||
|
// Error codes:
|
||||||
|
//
|
||||||
|
// - badAddress, if the email address is invalid.
|
||||||
|
func (c Client) SuppressionPresent(ctx context.Context, req SuppressionPresentRequest) (resp SuppressionPresentResult, err error) {
|
||||||
|
return transact[SuppressionPresentResult](ctx, c, "SuppressionPresent", req)
|
||||||
|
}
|
||||||
|
|
||||||
|
// MessageGet returns a message from the account storage in parsed form.
|
||||||
|
//
|
||||||
|
// Use [Client.MessageRawGet] for the raw message (internet message file).
|
||||||
|
//
|
||||||
|
// Error codes:
|
||||||
|
// - messageNotFound, if the message does not exist.
|
||||||
|
func (c Client) MessageGet(ctx context.Context, req MessageGetRequest) (resp MessageGetResult, err error) {
|
||||||
|
return transact[MessageGetResult](ctx, c, "MessageGet", req)
|
||||||
|
}
|
||||||
|
|
||||||
|
// MessageRawGet returns the full message in its original form, as stored on disk.
|
||||||
|
//
|
||||||
|
// Error codes:
|
||||||
|
// - messageNotFound, if the message does not exist.
|
||||||
|
func (c Client) MessageRawGet(ctx context.Context, req MessageRawGetRequest) (resp io.ReadCloser, err error) {
|
||||||
|
return transactReadCloser(ctx, c, "MessageRawGet", req)
|
||||||
|
}
|
||||||
|
|
||||||
|
// MessagePartGet returns a single part from a multipart message, by a "parts
|
||||||
|
// path", a series of indices into the multipart hierarchy as seen in the parsed
|
||||||
|
// message. The initial selection is the body of the outer message (excluding
|
||||||
|
// headers).
|
||||||
|
//
|
||||||
|
// Error codes:
|
||||||
|
// - messageNotFound, if the message does not exist.
|
||||||
|
// - partNotFound, if the part does not exist.
|
||||||
|
func (c Client) MessagePartGet(ctx context.Context, req MessagePartGetRequest) (resp io.ReadCloser, err error) {
|
||||||
|
return transactReadCloser(ctx, c, "MessagePartGet", req)
|
||||||
|
}
|
||||||
|
|
||||||
|
// MessageDelete permanently removes a message from the account storage (not moving
|
||||||
|
// to a Trash folder).
|
||||||
|
//
|
||||||
|
// Error codes:
|
||||||
|
// - messageNotFound, if the message does not exist.
|
||||||
|
func (c Client) MessageDelete(ctx context.Context, req MessageDeleteRequest) (resp MessageDeleteResult, err error) {
|
||||||
|
return transact[MessageDeleteResult](ctx, c, "MessageDelete", req)
|
||||||
|
}
|
||||||
|
|
||||||
|
// MessageFlagsAdd adds (sets) flags on a message, like the well-known flags
|
||||||
|
// beginning with a backslash like \seen, \answered, \draft, or well-known flags
|
||||||
|
// beginning with a dollar like $junk, $notjunk, $forwarded, or custom flags.
|
||||||
|
// Existing flags are left unchanged.
|
||||||
|
//
|
||||||
|
// Error codes:
|
||||||
|
// - messageNotFound, if the message does not exist.
|
||||||
|
func (c Client) MessageFlagsAdd(ctx context.Context, req MessageFlagsAddRequest) (resp MessageFlagsAddResult, err error) {
|
||||||
|
return transact[MessageFlagsAddResult](ctx, c, "MessageFlagsAdd", req)
|
||||||
|
}
|
||||||
|
|
||||||
|
// MessageFlagsRemove removes (clears) flags on a message.
|
||||||
|
// Other flags are left unchanged.
|
||||||
|
//
|
||||||
|
// Error codes:
|
||||||
|
// - messageNotFound, if the message does not exist.
|
||||||
|
func (c Client) MessageFlagsRemove(ctx context.Context, req MessageFlagsRemoveRequest) (resp MessageFlagsRemoveResult, err error) {
|
||||||
|
return transact[MessageFlagsRemoveResult](ctx, c, "MessageFlagsRemove", req)
|
||||||
|
}
|
||||||
|
|
||||||
|
// MessageMove moves a message to a new mailbox name (folder). The destination
|
||||||
|
// mailbox name must already exist.
|
||||||
|
//
|
||||||
|
// Error codes:
|
||||||
|
// - messageNotFound, if the message does not exist.
|
||||||
|
func (c Client) MessageMove(ctx context.Context, req MessageMoveRequest) (resp MessageMoveResult, err error) {
|
||||||
|
return transact[MessageMoveResult](ctx, c, "MessageMove", req)
|
||||||
|
}
|
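The client above can be exercised with a few lines of Go. The following is a minimal sketch (not part of this change), assuming a locally running "mox localserve" with its default credentials and base URL; the MsgID value is a placeholder and the request field names follow the JSON examples in the package documentation below.

	package main

	import (
		"context"
		"errors"
		"fmt"
		"log"

		"github.com/mjl-/mox/webapi"
	)

	func main() {
		c := webapi.Client{
			BaseURL:  "http://localhost:1080/webapi/v0/",
			Username: "mox@localhost",
			Password: "moxmoxmox",
		}
		ctx := context.Background()

		// List addresses currently on the account's suppression list.
		supl, err := c.SuppressionList(ctx, webapi.SuppressionListRequest{})
		if err != nil {
			log.Fatalf("suppression list: %v", err)
		}
		fmt.Printf("suppression list: %+v\n", supl)

		// Fetch a message in parsed form. 424 is a placeholder message ID; the
		// MsgID field name is assumed to match the JSON request examples below.
		msg, err := c.MessageGet(ctx, webapi.MessageGetRequest{MsgID: 424})
		var xerr webapi.Error
		if errors.As(err, &xerr) {
			// Errors from the webapi carry a code for programmatic handling.
			log.Fatalf("webapi error, code %q: %s", xerr.Code, xerr.Message)
		} else if err != nil {
			log.Fatalf("get message: %v", err)
		}
		fmt.Printf("parsed message: %+v\n", msg)
	}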
367
webapi/doc.go
Normal file
|
@@ -0,0 +1,367 @@
|
||||||
|
// NOTE: DO NOT EDIT, this file is generated by gendoc.sh.
|
||||||
|
|
||||||
|
/*
|
||||||
|
Package webapi implements a simple HTTP/JSON-based API for interacting with
|
||||||
|
email, and webhooks for notifications about incoming and outgoing deliveries,
|
||||||
|
including delivery failures.
|
||||||
|
|
||||||
|
# Overview
|
||||||
|
|
||||||
|
The webapi can be used to compose and send outgoing messages. The HTTP/JSON
|
||||||
|
API is often easier to use for developers since it doesn't require separate
|
||||||
|
libraries and/or having (detailed) knowledge about the format of email messages
|
||||||
|
("Internet Message Format"), or the SMTP protocol and its extensions.
|
||||||
|
|
||||||
|
Webhooks can be configured per account, and help with automated processing of
|
||||||
|
incoming email, and with handling delivery failures/success. Webhooks are
|
||||||
|
often easier to use for developers than monitoring a mailbox with IMAP and
|
||||||
|
processing new incoming email and delivery status notification (DSN) messages.
|
||||||
|
|
||||||
|
# Webapi
|
||||||
|
|
||||||
|
The webapi has a base URL at /webapi/v0/ by default, but configurable, which
|
||||||
|
serves an introduction that points to this documentation and lists the API
|
||||||
|
methods available.
|
||||||
|
|
||||||
|
An HTTP POST to /webapi/v0/<method> calls a method. The form can be either
|
||||||
|
"application/x-www-form-urlencoded" or "multipart/form-data". Form field
|
||||||
|
"request" must contain the request parameters, encoded as JSON.
|
||||||
|
|
||||||
|
HTTP basic authentication is required for calling methods, with an email
|
||||||
|
address as user name. Use a login address configured for "unique SMTP MAIL
|
||||||
|
FROM" addresses, and configure a period to "keep retired messages delivered
|
||||||
|
from the queue" for automatic suppression list management.
|
||||||
|
|
||||||
|
HTTP response status 200 OK indicates a successful method call, status 400
|
||||||
|
indicates an error. The response body of an error is a JSON object with a
|
||||||
|
human-readable "Message" field, and a "Code" field for programmatic handling
|
||||||
|
(common codes: "user" for user-induced errors, "server" for server-caused
|
||||||
|
errors). Most successful calls return a JSON object, but some return data
|
||||||
|
(e.g. a raw message or an attachment of a message). See [Methods] for the
|
||||||
|
methods and [Client] for their documentation. The first element of their
|
||||||
|
return values indicates the JSON object type, or io.ReadCloser for non-JSON
|
||||||
|
data. The request and response types are converted from/to JSON. Optional and
|
||||||
|
missing/empty fields/values are converted into Go zero values: zero for
|
||||||
|
numbers, empty strings, empty lists and empty objects. New fields may be added
|
||||||
|
in response objects in future versions; parsers should ignore unrecognized
|
||||||
|
fields.
|
||||||
|
|
||||||
|
An HTTP GET to a method URL serves an HTML page showing example
|
||||||
|
request/response JSON objects in a form and a button to call the method.
|
||||||
|
|
||||||
|
# Webhooks
|
||||||
|
|
||||||
|
Webhooks for outgoing delivery events and incoming deliveries are configured
|
||||||
|
per account.
|
||||||
|
|
||||||
|
A webhook is delivered by an HTTP POST with headers "X-Mox-Webhook-ID" (unique
|
||||||
|
ID of webhook) and "X-Mox-Webhook-Attempt" (number of delivery attempts,
|
||||||
|
starting at 1), and a JSON body with the webhook data. Webhook delivery
|
||||||
|
failures are retried at a schedule similar to message deliveries, until
|
||||||
|
permanent failure.
|
||||||
|
|
||||||
|
See [webhook.Outgoing] for the fields in a webhook for outgoing deliveries, and
|
||||||
|
in particular [webhook.OutgoingEvent] for the types of events.
|
||||||
|
|
||||||
|
Only the latest event for the delivery of a particular outgoing message will be
|
||||||
|
delivered, any webhooks for that message still in the queue (after failure to
|
||||||
|
deliver) are retired as superseded when a new event occurs.
|
||||||
|
|
||||||
|
Webhooks for incoming deliveries are configured separately from outgoing
|
||||||
|
deliveries. Incoming DSNs for previously sent messages do not cause a webhook
|
||||||
|
to the webhook URL for incoming messages, only to the webhook URL for outgoing
|
||||||
|
delivery events. The incoming webhook JSON payload contains the message
|
||||||
|
envelope (parsed To, Cc, Bcc, Subject and more headers), the MIME structure,
|
||||||
|
and the contents of the first text and HTML parts. See [webhook.Incoming] for
|
||||||
|
the fields in the JSON object. The full message and individual parts, including
|
||||||
|
attachments, can be retrieved using the webapi.
|
||||||
|
|
||||||
|
# Transactional email
|
||||||
|
|
||||||
|
When sending transactional emails, potentially to many recipients, it is
|
||||||
|
important to process delivery failure notifications. If messages are rejected,
|
||||||
|
or email addresses no longer exist, you should stop sending email to those
|
||||||
|
addresses. If you try to keep sending, the receiving mail servers may consider
|
||||||
|
that spammy behaviour and blocklist your mail server.
|
||||||
|
|
||||||
|
Automatic suppression list management already prevents most repeated sending
|
||||||
|
attempts. The webhooks make it easy to receive failure notifications.
|
||||||
|
|
||||||
|
To keep spam complaints about your messages to a minimum, include links to
|
||||||
|
unsubscribe from future messages without requiring further actions from the
|
||||||
|
user, such as logins. Include an unsubscribe link in the footer, and include
|
||||||
|
List-* message headers, such as List-Id, List-Unsubscribe and
|
||||||
|
List-Unsubscribe-Post.
|
||||||
|
|
||||||
|
# Webapi examples
|
||||||
|
|
||||||
|
Below are examples for making webapi calls to a locally running "mox
|
||||||
|
localserve" with its default credentials.
|
||||||
|
|
||||||
|
Send a basic message:
|
||||||
|
|
||||||
|
$ curl --user mox@localhost:moxmoxmox \
|
||||||
|
--data request='{"To": [{"Address": "mox@localhost"}], "Text": "hi ☺"}' \
|
||||||
|
http://localhost:1080/webapi/v0/Send
|
||||||
|
{
|
||||||
|
"MessageID": "<kVTha0Q-a5Zh1MuTh5rUjg@localhost>",
|
||||||
|
"Submissions": [
|
||||||
|
{
|
||||||
|
"Address": "mox@localhost",
|
||||||
|
"QueueMsgID": 10010,
|
||||||
|
"FromID": "ZfV16EATHwKEufrSMo055Q"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
|
||||||
|
Send a message with files both from form upload and base64 included in JSON:
|
||||||
|
|
||||||
|
$ curl --user mox@localhost:moxmoxmox \
|
||||||
|
--form request='{"To": [{"Address": "mox@localhost"}], "Subject": "hello", "Text": "hi ☺", "HTML": "<img src=\"cid:hi\" />", "AttachedFiles": [{"Name": "img.png", "ContentType": "image/png", "Data": "bWFkZSB5b3UgbG9vayE="}]}' \
|
||||||
|
--form 'inlinefile=@hi.png;headers="Content-ID: <hi>"' \
|
||||||
|
--form attachedfile=@mox.png \
|
||||||
|
http://localhost:1080/webapi/v0/Send
|
||||||
|
{
|
||||||
|
"MessageID": "<eZ3OEEA2odXovovIxHE49g@localhost>",
|
||||||
|
"Submissions": [
|
||||||
|
{
|
||||||
|
"Address": "mox@localhost",
|
||||||
|
"QueueMsgID": 10011,
|
||||||
|
"FromID": "yWiUQ6mvJND8FRPSmc9y5A"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
|
||||||
|
Get a message in parsed form:
|
||||||
|
|
||||||
|
$ curl --user mox@localhost:moxmoxmox --data request='{"MsgID": 424}' http://localhost:1080/webapi/v0/MessageGet
|
||||||
|
{
|
||||||
|
"Message": {
|
||||||
|
"From": [
|
||||||
|
{
|
||||||
|
"Name": "mox",
|
||||||
|
"Address": "mox@localhost"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"To": [
|
||||||
|
{
|
||||||
|
"Name": "",
|
||||||
|
"Address": "mox@localhost"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"CC": [],
|
||||||
|
"BCC": [],
|
||||||
|
"ReplyTo": [],
|
||||||
|
"MessageID": "<84vCeme_yZXyDzjWDeYBpg@localhost>",
|
||||||
|
"References": [],
|
||||||
|
"Date": "2024-04-04T14:29:42+02:00",
|
||||||
|
"Subject": "hello",
|
||||||
|
"Text": "hi \u263a\n",
|
||||||
|
"HTML": ""
|
||||||
|
},
|
||||||
|
"Structure": {
|
||||||
|
"ContentType": "multipart/mixed",
|
||||||
|
"ContentTypeParams": {
|
||||||
|
"boundary": "0ee72dc30dbab2ca6f7a363844a10a9f6111fc6dd31b8ff0b261478c2c48"
|
||||||
|
},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 0,
|
||||||
|
"Parts": [
|
||||||
|
{
|
||||||
|
"ContentType": "multipart/related",
|
||||||
|
"ContentTypeParams": {
|
||||||
|
"boundary": "b5ed0977ee2b628040f394c3f374012458379a4f3fcda5036371d761c81d"
|
||||||
|
},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 0,
|
||||||
|
"Parts": [
|
||||||
|
{
|
||||||
|
"ContentType": "multipart/alternative",
|
||||||
|
"ContentTypeParams": {
|
||||||
|
"boundary": "3759771adede7bd191ef37f2aa0e49ff67369f4000c320f198a875e96487"
|
||||||
|
},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 0,
|
||||||
|
"Parts": [
|
||||||
|
{
|
||||||
|
"ContentType": "text/plain",
|
||||||
|
"ContentTypeParams": {
|
||||||
|
"charset": "utf-8"
|
||||||
|
},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 8,
|
||||||
|
"Parts": []
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"ContentType": "text/html",
|
||||||
|
"ContentTypeParams": {
|
||||||
|
"charset": "us-ascii"
|
||||||
|
},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 22,
|
||||||
|
"Parts": []
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"ContentType": "image/png",
|
||||||
|
"ContentTypeParams": {},
|
||||||
|
"ContentID": "<hi>",
|
||||||
|
"DecodedSize": 19375,
|
||||||
|
"Parts": []
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"ContentType": "image/png",
|
||||||
|
"ContentTypeParams": {},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 14,
|
||||||
|
"Parts": []
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"ContentType": "image/png",
|
||||||
|
"ContentTypeParams": {},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 7766,
|
||||||
|
"Parts": []
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"Meta": {
|
||||||
|
"Size": 38946,
|
||||||
|
"DSN": false,
|
||||||
|
"Flags": [
|
||||||
|
"$notjunk",
|
||||||
|
"\seen"
|
||||||
|
],
|
||||||
|
"MailFrom": "",
|
||||||
|
"MailFromValidated": false,
|
||||||
|
"MsgFrom": "",
|
||||||
|
"MsgFromValidated": false,
|
||||||
|
"DKIMVerifiedDomains": [],
|
||||||
|
"RemoteIP": "",
|
||||||
|
"MailboxName": "Inbox"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
Errors (with a 400 bad request HTTP status response) include a human-readable
|
||||||
|
message and a code for programmatic use:
|
||||||
|
|
||||||
|
$ curl --user mox@localhost:moxmoxmox --data request='{"MsgID": 999}' http://localhost:1080/webapi/v0/MessageGet
|
||||||
|
{
|
||||||
|
"Code": "notFound",
|
||||||
|
"Message": "message not found"
|
||||||
|
}
|
||||||
|
|
||||||
|
Get a raw, unparsed message, as bytes:
|
||||||
|
|
||||||
|
$ curl --user mox@localhost:moxmoxmox --data request='{"MsgID": 123}' http://localhost:1080/webapi/v0/MessageRawGet
|
||||||
|
[message as bytes in raw form]
|
||||||
|
|
||||||
|
Mark a message as read:
|
||||||
|
|
||||||
|
$ curl --user mox@localhost:moxmoxmox --data request='{"MsgID": 424, "Flags": ["\\Seen", "custom"]}' http://localhost:1080/webapi/v0/MessageFlagsAdd
|
||||||
|
{}
|
||||||
|
|
||||||
|
# Webhook examples
|
||||||
|
|
||||||
|
A webhook is delivered by an HTTP POST, with headers X-Mox-Webhook-ID and
|
||||||
|
X-Mox-Webhook-Attempt and a JSON body with the data. To simulate a webhook call
|
||||||
|
for incoming messages, use:
|
||||||
|
|
||||||
|
curl -H 'X-Mox-Webhook-ID: 123' -H 'X-Mox-Webhook-Attempt: 1' --json '{...}' http://localhost/yourapp
|
||||||
|
|
||||||
|
Example webhook HTTP POST JSON body for successful outgoing delivery:
|
||||||
|
|
||||||
|
{
|
||||||
|
"Version": 0,
|
||||||
|
"Event": "delivered",
|
||||||
|
"DSN": false,
|
||||||
|
"Suppressing": false,
|
||||||
|
"QueueMsgID": 101,
|
||||||
|
"FromID": "MDEyMzQ1Njc4OWFiY2RlZg",
|
||||||
|
"MessageID": "<QnxzgulZK51utga6agH_rg@mox.example>",
|
||||||
|
"Subject": "subject of original message",
|
||||||
|
"WebhookQueued": "2024-03-27T00:00:00Z",
|
||||||
|
"SMTPCode": 250,
|
||||||
|
"SMTPEnhancedCode": "",
|
||||||
|
"Error": "",
|
||||||
|
"Extra": {}
|
||||||
|
}
|
||||||
|
|
||||||
|
Example webhook HTTP POST JSON body for failed delivery based on incoming DSN
|
||||||
|
message, with custom extra data fields (from original submission), and adding address to the suppression list:
|
||||||
|
|
||||||
|
{
|
||||||
|
"Version": 0,
|
||||||
|
"Event": "failed",
|
||||||
|
"DSN": true,
|
||||||
|
"Suppressing": true,
|
||||||
|
"QueueMsgID": 102,
|
||||||
|
"FromID": "MDEyMzQ1Njc4OWFiY2RlZg",
|
||||||
|
"MessageID": "<QnxzgulZK51utga6agH_rg@mox.example>",
|
||||||
|
"Subject": "subject of original message",
|
||||||
|
"WebhookQueued": "2024-03-27T00:00:00Z",
|
||||||
|
"SMTPCode": 554,
|
||||||
|
"SMTPEnhancedCode": "5.4.0",
|
||||||
|
"Error": "timeout connecting to host",
|
||||||
|
"Extra": {
|
||||||
|
"userid": "456"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
Example JSON body for webhooks for incoming delivery of basic message:
|
||||||
|
|
||||||
|
{
|
||||||
|
"Version": 0,
|
||||||
|
"From": [
|
||||||
|
{
|
||||||
|
"Name": "",
|
||||||
|
"Address": "mox@localhost"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"To": [
|
||||||
|
{
|
||||||
|
"Name": "",
|
||||||
|
"Address": "mjl@localhost"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"CC": [],
|
||||||
|
"BCC": [],
|
||||||
|
"ReplyTo": [],
|
||||||
|
"Subject": "hi",
|
||||||
|
"MessageID": "<QnxzgulZK51utga6agH_rg@mox.example>",
|
||||||
|
"InReplyTo": "",
|
||||||
|
"References": [],
|
||||||
|
"Date": "2024-03-27T00:00:00Z",
|
||||||
|
"Text": "hello world ☺\n",
|
||||||
|
"HTML": "",
|
||||||
|
"Structure": {
|
||||||
|
"ContentType": "text/plain",
|
||||||
|
"ContentTypeParams": {
|
||||||
|
"charset": "utf-8"
|
||||||
|
},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 17,
|
||||||
|
"Parts": []
|
||||||
|
},
|
||||||
|
"Meta": {
|
||||||
|
"MsgID": 201,
|
||||||
|
"MailFrom": "mox@localhost",
|
||||||
|
"MailFromValidated": false,
|
||||||
|
"MsgFromValidated": true,
|
||||||
|
"RcptTo": "mjl@localhost",
|
||||||
|
"DKIMVerifiedDomains": [
|
||||||
|
"localhost"
|
||||||
|
],
|
||||||
|
"RemoteIP": "127.0.0.1",
|
||||||
|
"Received": "2024-03-27T00:00:03Z",
|
||||||
|
"MailboxName": "Inbox",
|
||||||
|
"Automated": false
|
||||||
|
}
|
||||||
|
}
|
||||||
|
*/
|
||||||
|
package webapi
|
||||||
|
|
||||||
|
// NOTE: DO NOT EDIT, this file is generated by gendoc.sh.
|
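To illustrate the receiving side of the webhooks documented above, here is a minimal sketch of an endpoint (not part of this change), assuming the account's incoming webhook URL points at /yourapp on a server listening on localhost:8000; the field names of webhook.Incoming follow the example JSON payload above.

	package main

	import (
		"encoding/json"
		"log"
		"net/http"

		"github.com/mjl-/mox/webhook"
	)

	func main() {
		http.HandleFunc("/yourapp", func(w http.ResponseWriter, r *http.Request) {
			// Mox sets these headers on each delivery attempt.
			id := r.Header.Get("X-Mox-Webhook-ID")
			attempt := r.Header.Get("X-Mox-Webhook-Attempt")

			var in webhook.Incoming
			if err := json.NewDecoder(r.Body).Decode(&in); err != nil {
				http.Error(w, "bad json", http.StatusBadRequest)
				return
			}
			log.Printf("webhook %s (attempt %s): message %s, subject %q", id, attempt, in.MessageID, in.Subject)

			// A 200 OK acknowledges the webhook; failed deliveries are retried as described above.
			w.WriteHeader(http.StatusOK)
		})
		log.Fatal(http.ListenAndServe("localhost:8000", nil))
	}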
297
webapi/gendoc.sh
Executable file
|
@@ -0,0 +1,297 @@
|
||||||
|
#!/bin/bash
|
||||||
|
set -euo pipefail
|
||||||
|
|
||||||
|
# this is run with .. as working directory.
|
||||||
|
|
||||||
|
# note: outgoing hook events are in ../queue/hooks.go, ../mox-/config.go, ../queue.go and ../webapi/gendoc.sh. keep in sync.
|
||||||
|
|
||||||
|
# todo: find some proper way to generate the curl commands and responses automatically...
|
||||||
|
|
||||||
|
cat <<EOF
|
||||||
|
// NOTE: DO NOT EDIT, this file is generated by gendoc.sh.
|
||||||
|
|
||||||
|
/*
|
||||||
|
Package webapi implements a simple HTTP/JSON-based API for interacting with
|
||||||
|
email, and webhooks for notifications about incoming and outgoing deliveries,
|
||||||
|
including delivery failures.
|
||||||
|
|
||||||
|
# Overview
|
||||||
|
|
||||||
|
The webapi can be used to compose and send outgoing messages. The HTTP/JSON
|
||||||
|
API is often easier to use for developers since it doesn't require separate
|
||||||
|
libraries and/or having (detailed) knowledge about the format of email messages
|
||||||
|
("Internet Message Format"), or the SMTP protocol and its extensions.
|
||||||
|
|
||||||
|
Webhooks can be configured per account, and help with automated processing of
|
||||||
|
incoming email, and with handling delivery failures/success. Webhooks are
|
||||||
|
often easier to use for developers than monitoring a mailbox with IMAP and
|
||||||
|
processing new incoming email and delivery status notification (DSN) messages.
|
||||||
|
|
||||||
|
# Webapi
|
||||||
|
|
||||||
|
The webapi has a base URL at /webapi/v0/ by default, but configurable, which
|
||||||
|
serves an introduction that points to this documentation and lists the API
|
||||||
|
methods available.
|
||||||
|
|
||||||
|
An HTTP POST to /webapi/v0/<method> calls a method. The form can be either
|
||||||
|
"application/x-www-form-urlencoded" or "multipart/form-data". Form field
|
||||||
|
"request" must contain the request parameters, encoded as JSON.
|
||||||
|
|
||||||
|
HTTP basic authentication is required for calling methods, with an email
|
||||||
|
address as user name. Use a login address configured for "unique SMTP MAIL
|
||||||
|
FROM" addresses, and configure a period to "keep retired messages delivered
|
||||||
|
from the queue" for automatic suppression list management.
|
||||||
|
|
||||||
|
HTTP response status 200 OK indicates a successful method call, status 400
|
||||||
|
indicates an error. The response body of an error is a JSON object with a
|
||||||
|
human-readable "Message" field, and a "Code" field for programmatic handling
|
||||||
|
(common codes: "user" for user-induced errors, "server" for server-caused
|
||||||
|
errors). Most successful calls return a JSON object, but some return data
|
||||||
|
(e.g. a raw message or an attachment of a message). See [Methods] for the
|
||||||
|
methods and [Client] for their documentation. The first element of their
|
||||||
|
return values indicates the JSON object type, or io.ReadCloser for non-JSON
|
||||||
|
data. The request and response types are converted from/to JSON. Optional and
|
||||||
|
missing/empty fields/values are converted into Go zero values: zero for
|
||||||
|
numbers, empty strings, empty lists and empty objects. New fields may be added
|
||||||
|
in response objects in future versions; parsers should ignore unrecognized
|
||||||
|
fields.
|
||||||
|
|
||||||
|
An HTTP GET to a method URL serves an HTML page showing example
|
||||||
|
request/response JSON objects in a form and a button to call the method.
|
||||||
|
|
||||||
|
# Webhooks
|
||||||
|
|
||||||
|
Webhooks for outgoing delivery events and incoming deliveries are configured
|
||||||
|
per account.
|
||||||
|
|
||||||
|
A webhook is delivered by an HTTP POST with headers "X-Mox-Webhook-ID" (unique
|
||||||
|
ID of webhook) and "X-Mox-Webhook-Attempt" (number of delivery attempts,
|
||||||
|
starting at 1), and a JSON body with the webhook data. Webhook delivery
|
||||||
|
failures are retried at a schedule similar to message deliveries, until
|
||||||
|
permanent failure.
|
||||||
|
|
||||||
|
See [webhook.Outgoing] for the fields in a webhook for outgoing deliveries, and
|
||||||
|
in particular [webhook.OutgoingEvent] for the types of events.
|
||||||
|
|
||||||
|
Only the latest event for the delivery of a particular outgoing message will be
|
||||||
|
delivered, any webhooks for that message still in the queue (after failure to
|
||||||
|
deliver) are retired as superseded when a new event occurs.
|
||||||
|
|
||||||
|
Webhooks for incoming deliveries are configured separately from outgoing
|
||||||
|
deliveries. Incoming DSNs for previously sent messages do not cause a webhook
|
||||||
|
to the webhook URL for incoming messages, only to the webhook URL for outgoing
|
||||||
|
delivery events. The incoming webhook JSON payload contains the message
|
||||||
|
envelope (parsed To, Cc, Bcc, Subject and more headers), the MIME structure,
|
||||||
|
and the contents of the first text and HTML parts. See [webhook.Incoming] for
|
||||||
|
the fields in the JSON object. The full message and individual parts, including
|
||||||
|
attachments, can be retrieved using the webapi.
|
||||||
|
|
||||||
|
# Transactional email
|
||||||
|
|
||||||
|
When sending transactional emails, potentially to many recipients, it is
|
||||||
|
important to process delivery failure notifications. If messages are rejected,
|
||||||
|
or email addresses no longer exist, you should stop sending email to those
|
||||||
|
addresses. If you try to keep sending, the receiving mail servers may consider
|
||||||
|
that spammy behaviour and blocklist your mail server.
|
||||||
|
|
||||||
|
Automatic suppression list management already prevents most repeated sending
|
||||||
|
attempts. The webhooks make it easy to receive failure notifications.
|
||||||
|
|
||||||
|
To keep spam complaints about your messages to a minimum, include links to
|
||||||
|
unsubscribe from future messages without requiring further actions from the
|
||||||
|
user, such as logins. Include an unsubscribe link in the footer, and include
|
||||||
|
List-* message headers, such as List-Id, List-Unsubscribe and
|
||||||
|
List-Unsubscribe-Post.
|
||||||
|
|
||||||
|
# Webapi examples
|
||||||
|
|
||||||
|
Below are examples for making webapi calls to a locally running "mox
|
||||||
|
localserve" with its default credentials.
|
||||||
|
|
||||||
|
Send a basic message:
|
||||||
|
|
||||||
|
\$ curl --user mox@localhost:moxmoxmox \\
|
||||||
|
--data request='{"To": [{"Address": "mox@localhost"}], "Text": "hi ☺"}' \\
|
||||||
|
http://localhost:1080/webapi/v0/Send
|
||||||
|
{
|
||||||
|
"MessageID": "<kVTha0Q-a5Zh1MuTh5rUjg@localhost>",
|
||||||
|
"Submissions": [
|
||||||
|
{
|
||||||
|
"Address": "mox@localhost",
|
||||||
|
"QueueMsgID": 10010,
|
||||||
|
"FromID": "ZfV16EATHwKEufrSMo055Q"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
Send a message with files both from form upload and base64 included in JSON:
|
||||||
|
|
||||||
|
\$ curl --user mox@localhost:moxmoxmox \\
|
||||||
|
--form request='{"To": [{"Address": "mox@localhost"}], "Subject": "hello", "Text": "hi ☺", "HTML": "<img src=\"cid:hi\" />", "AttachedFiles": [{"Name": "img.png", "ContentType": "image/png", "Data": "bWFkZSB5b3UgbG9vayE="}]}' \\
|
||||||
|
--form 'inlinefile=@hi.png;headers="Content-ID: <hi>"' \\
|
||||||
|
--form attachedfile=@mox.png \\
|
||||||
|
http://localhost:1080/webapi/v0/Send
|
||||||
|
{
|
||||||
|
"MessageID": "<eZ3OEEA2odXovovIxHE49g@localhost>",
|
||||||
|
"Submissions": [
|
||||||
|
{
|
||||||
|
"Address": "mox@localhost",
|
||||||
|
"QueueMsgID": 10011,
|
||||||
|
"FromID": "yWiUQ6mvJND8FRPSmc9y5A"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
|
||||||
|
Get a message in parsed form:
|
||||||
|
|
||||||
|
\$ curl --user mox@localhost:moxmoxmox --data request='{"MsgID": 424}' http://localhost:1080/webapi/v0/MessageGet
|
||||||
|
{
|
||||||
|
"Message": {
|
||||||
|
"From": [
|
||||||
|
{
|
||||||
|
"Name": "mox",
|
||||||
|
"Address": "mox@localhost"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"To": [
|
||||||
|
{
|
||||||
|
"Name": "",
|
||||||
|
"Address": "mox@localhost"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"CC": [],
|
||||||
|
"BCC": [],
|
||||||
|
"ReplyTo": [],
|
||||||
|
"MessageID": "<84vCeme_yZXyDzjWDeYBpg@localhost>",
|
||||||
|
"References": [],
|
||||||
|
"Date": "2024-04-04T14:29:42+02:00",
|
||||||
|
"Subject": "hello",
|
||||||
|
"Text": "hi \u263a\n",
|
||||||
|
"HTML": ""
|
||||||
|
},
|
||||||
|
"Structure": {
|
||||||
|
"ContentType": "multipart/mixed",
|
||||||
|
"ContentTypeParams": {
|
||||||
|
"boundary": "0ee72dc30dbab2ca6f7a363844a10a9f6111fc6dd31b8ff0b261478c2c48"
|
||||||
|
},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 0,
|
||||||
|
"Parts": [
|
||||||
|
{
|
||||||
|
"ContentType": "multipart/related",
|
||||||
|
"ContentTypeParams": {
|
||||||
|
"boundary": "b5ed0977ee2b628040f394c3f374012458379a4f3fcda5036371d761c81d"
|
||||||
|
},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 0,
|
||||||
|
"Parts": [
|
||||||
|
{
|
||||||
|
"ContentType": "multipart/alternative",
|
||||||
|
"ContentTypeParams": {
|
||||||
|
"boundary": "3759771adede7bd191ef37f2aa0e49ff67369f4000c320f198a875e96487"
|
||||||
|
},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 0,
|
||||||
|
"Parts": [
|
||||||
|
{
|
||||||
|
"ContentType": "text/plain",
|
||||||
|
"ContentTypeParams": {
|
||||||
|
"charset": "utf-8"
|
||||||
|
},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 8,
|
||||||
|
"Parts": []
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"ContentType": "text/html",
|
||||||
|
"ContentTypeParams": {
|
||||||
|
"charset": "us-ascii"
|
||||||
|
},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 22,
|
||||||
|
"Parts": []
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"ContentType": "image/png",
|
||||||
|
"ContentTypeParams": {},
|
||||||
|
"ContentID": "<hi>",
|
||||||
|
"DecodedSize": 19375,
|
||||||
|
"Parts": []
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"ContentType": "image/png",
|
||||||
|
"ContentTypeParams": {},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 14,
|
||||||
|
"Parts": []
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"ContentType": "image/png",
|
||||||
|
"ContentTypeParams": {},
|
||||||
|
"ContentID": "",
|
||||||
|
"DecodedSize": 7766,
|
||||||
|
"Parts": []
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"Meta": {
|
||||||
|
"Size": 38946,
|
||||||
|
"DSN": false,
|
||||||
|
"Flags": [
|
||||||
|
"\$notjunk",
|
||||||
|
"\\seen"
|
||||||
|
],
|
||||||
|
"MailFrom": "",
|
||||||
|
"MailFromValidated": false,
|
||||||
|
"MsgFrom": "",
|
||||||
|
"MsgFromValidated": false,
|
||||||
|
"DKIMVerifiedDomains": [],
|
||||||
|
"RemoteIP": "",
|
||||||
|
"MailboxName": "Inbox"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
Errors (with a 400 bad request HTTP status response) include a human-readable
|
||||||
|
message and a code for programmatic use:
|
||||||
|
|
||||||
|
\$ curl --user mox@localhost:moxmoxmox --data request='{"MsgID": 999}' http://localhost:1080/webapi/v0/MessageGet
|
||||||
|
{
|
||||||
|
"Code": "notFound",
|
||||||
|
"Message": "message not found"
|
||||||
|
}
|
||||||
|
|
||||||
|
Get a raw, unparsed message, as bytes:
|
||||||
|
|
||||||
|
\$ curl --user mox@localhost:moxmoxmox --data request='{"MsgID": 123}' http://localhost:1080/webapi/v0/MessageRawGet
|
||||||
|
[message as bytes in raw form]
|
||||||
|
|
||||||
|
Mark a message as read:
|
||||||
|
|
||||||
|
\$ curl --user mox@localhost:moxmoxmox --data request='{"MsgID": 424, "Flags": ["\\\\Seen", "custom"]}' http://localhost:1080/webapi/v0/MessageFlagsAdd
|
||||||
|
{}
|
||||||
|
|
||||||
|
# Webhook examples
|
||||||
|
|
||||||
|
A webhook is delivered by an HTTP POST, with headers X-Mox-Webhook-ID and
|
||||||
|
X-Mox-Webhook-Attempt and a JSON body with the data. To simulate a webhook call
|
||||||
|
for incoming messages, use:
|
||||||
|
|
||||||
|
curl -H 'X-Mox-Webhook-ID: 123' -H 'X-Mox-Webhook-Attempt: 1' --json '{...}' http://localhost/yourapp
|
||||||
|
|
||||||
|
EOF
|
||||||
|
|
||||||
|
for ex in $(./mox example | grep webhook); do
|
||||||
|
./mox example $ex
|
||||||
|
echo
|
||||||
|
done
|
||||||
|
|
||||||
|
cat <<EOF
|
||||||
|
*/
|
||||||
|
package webapi
|
||||||
|
// NOTE: DO NOT EDIT, this file is generated by gendoc.sh.
|
||||||
|
EOF
|
29
webapi/limitreader.go
Normal file
|
@@ -0,0 +1,29 @@
|
||||||
|
package webapi
|
||||||
|
|
||||||
|
// similar between ../moxio/limitreader.go and ../webapi/limitreader.go
|
||||||
|
|
||||||
|
import (
|
||||||
|
"errors"
|
||||||
|
"io"
|
||||||
|
)
|
||||||
|
|
||||||
|
var errLimit = errors.New("input exceeds maximum size") // Returned by limitReader.
|
||||||
|
|
||||||
|
// limitReader reads up to Limit bytes, returning an error if more bytes are
|
||||||
|
// read. A limitReader can be used to enforce a maximum input length.
|
||||||
|
type limitReader struct {
|
||||||
|
R io.Reader
|
||||||
|
Limit int64
|
||||||
|
}
|
||||||
|
|
||||||
|
// Read reads bytes from the underlying reader.
|
||||||
|
func (r *limitReader) Read(buf []byte) (int, error) {
|
||||||
|
n, err := r.R.Read(buf)
|
||||||
|
if n > 0 {
|
||||||
|
r.Limit -= int64(n)
|
||||||
|
if r.Limit < 0 {
|
||||||
|
return 0, errLimit
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return n, err
|
||||||
|
}
|
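
As an illustration (not part of the diff), a minimal sketch of how such a
reader might be used inside the package to cap the size of a request body; the
helper name and its fmt-based error are assumptions (and would need an fmt
import):

	// readBodyCapped reads at most limit bytes from r, turning errLimit into a
	// descriptive error for the caller.
	func readBodyCapped(r io.Reader, limit int64) ([]byte, error) {
		lr := &limitReader{R: r, Limit: limit}
		buf, err := io.ReadAll(lr)
		if err == errLimit {
			return nil, fmt.Errorf("request body exceeds maximum of %d bytes", limit)
		}
		return buf, err
	}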

webapi/webapi.go (new file, 260 lines)
@@ -0,0 +1,260 @@
package webapi

import (
	"context"
	"io"
	"time"

	"github.com/mjl-/mox/webhook"
)

// todo future: we can have text and html templates, let submitters reference them along with parameters, and compose the message bodies ourselves.
// todo future: generate api specs (e.g. openapi) for webapi
// todo future: consider deprecating some of the webapi in favor of jmap

// Methods of the webapi. More methods may be added in the future. See [Client]
// for documentation.
type Methods interface {
	Send(ctx context.Context, request SendRequest) (response SendResult, err error)
	SuppressionList(ctx context.Context, request SuppressionListRequest) (response SuppressionListResult, err error)
	SuppressionAdd(ctx context.Context, request SuppressionAddRequest) (response SuppressionAddResult, err error)
	SuppressionRemove(ctx context.Context, request SuppressionRemoveRequest) (response SuppressionRemoveResult, err error)
	SuppressionPresent(ctx context.Context, request SuppressionPresentRequest) (response SuppressionPresentResult, err error)
	MessageGet(ctx context.Context, request MessageGetRequest) (response MessageGetResult, err error)
	MessageRawGet(ctx context.Context, request MessageRawGetRequest) (response io.ReadCloser, err error)
	MessagePartGet(ctx context.Context, request MessagePartGetRequest) (response io.ReadCloser, err error)
	MessageDelete(ctx context.Context, request MessageDeleteRequest) (response MessageDeleteResult, err error)
	MessageFlagsAdd(ctx context.Context, request MessageFlagsAddRequest) (response MessageFlagsAddResult, err error)
	MessageFlagsRemove(ctx context.Context, request MessageFlagsRemoveRequest) (response MessageFlagsRemoveResult, err error)
	MessageMove(ctx context.Context, request MessageMoveRequest) (response MessageMoveResult, err error)
}

// Error indicates an API-related error.
type Error struct {
	// For programmatic handling. Common values: "user" for generic error by user,
	// "server" for a server-side processing error, "badAddress" for malformed email
	// addresses.
	Code string

	// Human readable error message.
	Message string
}

// Error returns the human-readable error message.
func (e Error) Error() string {
	return e.Message
}

type NameAddress struct {
	Name    string // Optional, human-readable "display name" of the addressee.
	Address string // Required, email address.
}

// Message is an email message, used both for outgoing submitted messages and
// incoming messages.
type Message struct {
	// For sending, if empty, automatically filled based on authenticated user and
	// account information. Outgoing messages are allowed maximum 1 From address,
	// incoming messages can in theory have zero or multiple, but typically have just
	// one.
	From []NameAddress

	// To/Cc/Bcc message headers. Outgoing messages are sent to all these addresses.
	// All are optional, but there should be at least one addressee.
	To []NameAddress
	CC []NameAddress
	// For submissions, BCC addressees receive the message but are not added to the
	// message headers. For incoming messages, this is typically empty.
	BCC []NameAddress

	// Optional Reply-To header, where the recipient is asked to send replies to.
	ReplyTo []NameAddress

	// Message-ID from message header, should be wrapped in <>'s. For outgoing
	// messages, a unique message-id is generated if empty.
	MessageID string

	// Optional. References to message-id's (including <>) of other messages, if this
	// is a reply or forwarded message. References are from oldest (ancestor) to most
	// recent message. For outgoing messages, if non-empty then In-Reply-To is set to
	// the last element.
	References []string

	// Optional, set to time of submission for outgoing messages if nil.
	Date *time.Time

	// Subject header, optional.
	Subject string

	// For outgoing messages, at least text or HTML must be non-empty. If both are
	// present, a multipart/alternative part is created. Lines must be
	// \n-separated, automatically replaced with \r\n when composing the message.
	// For parsed, incoming messages, values are truncated to 1MB (1024*1024 bytes).
	// Use MessagePartGet to retrieve the full part data.
	Text string
	HTML string
}

// SendRequest submits a message to be delivered.
type SendRequest struct {
	// Message with headers and contents to compose. Additional headers and files can
	// be added too (see below, and the use of multipart/form-data requests). The
	// fields of Message are included directly in SendRequest. Required.
	Message

	// Metadata to associate with the delivery, through the queue, including webhooks
	// about delivery events. Metadata can also be set with regular SMTP submission
	// through message headers "X-Mox-Extra-<key>: <value>". Current behaviour is as
	// follows, but this may change: 1. Keys are canonicalized, each dash-separated
	// word changed to start with a capital. 2. Keys cannot be duplicated. 3. These
	// headers are not removed when delivering.
	Extra map[string]string

	// Additional custom headers to include in outgoing message. Optional.
	// Unless a User-Agent or X-Mailer header is present, a User-Agent is added.
	Headers [][2]string

	// Inline files are added to the message and should be displayed by mail clients as
	// part of the message contents. Inline files cause a part with content-type
	// "multipart/related" to be added to the message. Optional.
	InlineFiles []File

	// Attached files are added to the message and should be shown as files that can be
	// saved. Attached files cause a part with content-type "multipart/mixed" to be
	// added to the message. Optional.
	AttachedFiles []File

	// If absent/null, regular TLS requirements apply (opportunistic TLS, DANE,
	// MTA-STS). If true, the SMTP REQUIRETLS extension is required, enforcing verified
	// TLS along the delivery path. If false, TLS requirements are relaxed and
	// DANE/MTA-STS policies may be ignored to increase the odds of successful but
	// insecure delivery. Optional.
	RequireTLS *bool

	// If set, it should be a time in the future at which the first delivery attempt
	// starts. Optional.
	FutureRelease *time.Time

	// Whether to store outgoing message in designated Sent mailbox (if configured).
	SaveSent bool
}

type File struct {
	Name        string // Optional.
	ContentType string // E.g. application/pdf or image/png, automatically detected if empty.
	ContentID   string // E.g. "<randomid>", for use in html email with "cid:<randomid>". Optional.
	Data        string // Base64-encoded contents of the file. Required.
}

// MessageMeta is returned as part of MessageGet.
type MessageMeta struct {
	Size                int64    // Total size of raw message file.
	DSN                 bool     // Whether this message is a DSN.
	Flags               []string // Standard message flags like \seen, \answered, $forwarded, $junk, $nonjunk, and custom keywords.
	MailFrom            string   // Address used during SMTP "MAIL FROM" command.
	MailFromValidated   bool     // Whether SMTP MAIL FROM address was SPF-validated.
	MsgFrom             string   // Address used in message "From" header.
	MsgFromValidated    bool     // Whether address in message "From"-header was DMARC(-like) validated.
	DKIMVerifiedDomains []string // Verified domains from DKIM-signature in message. Can be different domain than used in addresses.
	RemoteIP            string   // Where the message was delivered from.
	MailboxName         string
}

type SendResult struct {
	MessageID   string       // "<random>@<domain>", as added by submitter or automatically generated during submission.
	Submissions []Submission // Messages submitted to queue for delivery. In order of To, CC, BCC fields in request.
}

type Submission struct {
	Address    string // From original recipient (to/cc/bcc).
	QueueMsgID int64  // Of message added to delivery queue, later webhook calls reference this same ID.
	FromID     string // Unique ID used during delivery, later webhook calls reference this same FromID.
}

// Suppression is an address to which messages will not be delivered. Attempts to
// deliver or queue will result in an immediate permanent failure to deliver.
type Suppression struct {
	ID      int64
	Created time.Time `bstore:"default now"`

	// Suppression applies to this account only.
	Account string `bstore:"nonzero,unique Account+BaseAddress"`

	// Unicode. Address with fictional simplified localpart: lowercase, dots removed
	// (gmail), first token before any "-" or "+" (typical catchall separator).
	BaseAddress string `bstore:"nonzero"`

	// Unicode. Address that caused this suppression.
	OriginalAddress string `bstore:"nonzero"`

	Manual bool
	Reason string
}

type SuppressionListRequest struct{}
type SuppressionListResult struct {
	Suppressions []Suppression // Current suppressed addresses for account.
}

type SuppressionAddRequest struct {
	EmailAddress string
	Manual       bool   // Whether added manually or automatically.
	Reason       string // Free-form text.
}
type SuppressionAddResult struct{}

type SuppressionRemoveRequest struct {
	EmailAddress string
}
type SuppressionRemoveResult struct{}

type SuppressionPresentRequest struct {
	EmailAddress string
}
type SuppressionPresentResult struct {
	Present bool
}

type MessageGetRequest struct {
	MsgID int64
}
type MessageGetResult struct {
	Message   Message
	Structure webhook.Structure // MIME structure.
	Meta      MessageMeta       // Additional information about message and SMTP delivery.
}

type MessageRawGetRequest struct {
	MsgID int64
}

type MessagePartGetRequest struct {
	MsgID int64

	// Indexes into MIME parts, e.g. [0, 2] first dereferences the first element in a
	// multipart message, then the 3rd part within that first element.
	PartPath []int
}

type MessageDeleteRequest struct {
	MsgID int64
}
type MessageDeleteResult struct{}

type MessageFlagsAddRequest struct {
	MsgID int64
	Flags []string // Standard message flags like \seen, \answered, $forwarded, $junk, $nonjunk, and custom keywords.
}
type MessageFlagsAddResult struct{}

type MessageFlagsRemoveRequest struct {
	MsgID int64
	Flags []string
}
type MessageFlagsRemoveResult struct{}

type MessageMoveRequest struct {
	MsgID           int64
	DestMailboxName string // E.g. "Inbox", must already exist.
}
type MessageMoveResult struct{}
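
As an illustration (not from the diff), a minimal sketch of submitting a
message through this API with the [Client] from the same package; the URL and
credentials are the localserve defaults used in the curl examples above:

	package main

	import (
		"context"
		"fmt"
		"log"

		"github.com/mjl-/mox/webapi"
	)

	func main() {
		client := webapi.Client{
			BaseURL:  "http://localhost:1080/webapi/v0/",
			Username: "mox@localhost",
			Password: "moxmoxmox",
		}
		resp, err := client.Send(context.Background(), webapi.SendRequest{
			Message: webapi.Message{
				To:      []webapi.NameAddress{{Address: "mox@localhost"}},
				Subject: "hello from the webapi",
				Text:    "hi!\n",
			},
		})
		if err != nil {
			log.Fatalf("send: %v", err)
		}
		for _, sub := range resp.Submissions {
			// Each recipient results in one queued message.
			fmt.Println("queued for", sub.Address, "queue msg id", sub.QueueMsgID)
		}
	}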

webapisrv/server.go (new file, 1330 lines; file diff suppressed because it is too large)

webapisrv/server_test.go (new file, 491 lines)
@ -0,0 +1,491 @@
|
||||||
|
package webapisrv
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"context"
|
||||||
|
"encoding/base64"
|
||||||
|
"encoding/json"
|
||||||
|
"fmt"
|
||||||
|
"io"
|
||||||
|
"mime/multipart"
|
||||||
|
"net/http"
|
||||||
|
"net/http/httptest"
|
||||||
|
"net/textproto"
|
||||||
|
"os"
|
||||||
|
"path/filepath"
|
||||||
|
"reflect"
|
||||||
|
"slices"
|
||||||
|
"strings"
|
||||||
|
"testing"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/mjl-/mox/message"
|
||||||
|
"github.com/mjl-/mox/mlog"
|
||||||
|
"github.com/mjl-/mox/mox-"
|
||||||
|
"github.com/mjl-/mox/queue"
|
||||||
|
"github.com/mjl-/mox/store"
|
||||||
|
"github.com/mjl-/mox/webapi"
|
||||||
|
"github.com/mjl-/mox/webhook"
|
||||||
|
)
|
||||||
|
|
||||||
|
var ctxbg = context.Background()
|
||||||
|
|
||||||
|
func tcheckf(t *testing.T, err error, format string, args ...any) {
|
||||||
|
t.Helper()
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("%s: %s", fmt.Sprintf(format, args...), err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func tcompare(t *testing.T, got, expect any) {
|
||||||
|
t.Helper()
|
||||||
|
if !reflect.DeepEqual(got, expect) {
|
||||||
|
t.Fatalf("got:\n%#v\nexpected:\n%#v", got, expect)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func terrcode(t *testing.T, err error, code string) {
|
||||||
|
t.Helper()
|
||||||
|
if err == nil {
|
||||||
|
t.Fatalf("no error, expected error with code %q", code)
|
||||||
|
}
|
||||||
|
if xerr, ok := err.(webapi.Error); !ok {
|
||||||
|
t.Fatalf("got %v, expected webapi error with code %q", err, code)
|
||||||
|
} else if xerr.Code != code {
|
||||||
|
t.Fatalf("got error code %q, expected %q", xerr.Code, code)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestServer(t *testing.T) {
|
||||||
|
mox.LimitersInit()
|
||||||
|
os.RemoveAll("../testdata/webapisrv/data")
|
||||||
|
mox.Context = ctxbg
|
||||||
|
mox.ConfigStaticPath = filepath.FromSlash("../testdata/webapisrv/mox.conf")
|
||||||
|
mox.MustLoadConfig(true, false)
|
||||||
|
defer store.Switchboard()()
|
||||||
|
err := queue.Init()
|
||||||
|
tcheckf(t, err, "queue init")
|
||||||
|
|
||||||
|
log := mlog.New("webapisrv", nil)
|
||||||
|
acc, err := store.OpenAccount(log, "mjl")
|
||||||
|
tcheckf(t, err, "open account")
|
||||||
|
const pw0 = "te\u0301st \u00a0\u2002\u200a" // NFD and various unicode spaces.
|
||||||
|
const pw1 = "tést " // PRECIS normalized, with NFC.
|
||||||
|
err = acc.SetPassword(log, pw0)
|
||||||
|
tcheckf(t, err, "set password")
|
||||||
|
defer func() {
|
||||||
|
err := acc.Close()
|
||||||
|
log.Check(err, "closing account")
|
||||||
|
}()
|
||||||
|
|
||||||
|
s := NewServer(100*1024, "/webapi/", false).(server)
|
||||||
|
hs := httptest.NewServer(s)
|
||||||
|
defer hs.Close()
|
||||||
|
|
||||||
|
// server expects the mount path to be stripped already.
|
||||||
|
client := webapi.Client{BaseURL: hs.URL + "/v0/", Username: "mjl@mox.example", Password: pw0}
|
||||||
|
|
||||||
|
testHTTPHdrsBody := func(s server, method, path string, headers map[string]string, body string, expCode int, expTooMany bool, expCT, expErrCode string) {
|
||||||
|
t.Helper()
|
||||||
|
|
||||||
|
r := httptest.NewRequest(method, path, strings.NewReader(body))
|
||||||
|
for k, v := range headers {
|
||||||
|
r.Header.Set(k, v)
|
||||||
|
}
|
||||||
|
w := httptest.NewRecorder()
|
||||||
|
s.ServeHTTP(w, r)
|
||||||
|
res := w.Result()
|
||||||
|
if res.StatusCode != http.StatusTooManyRequests || !expTooMany {
|
||||||
|
tcompare(t, res.StatusCode, expCode)
|
||||||
|
}
|
||||||
|
if expCT != "" {
|
||||||
|
tcompare(t, res.Header.Get("Content-Type"), expCT)
|
||||||
|
}
|
||||||
|
if expErrCode != "" {
|
||||||
|
dec := json.NewDecoder(res.Body)
|
||||||
|
dec.DisallowUnknownFields()
|
||||||
|
var apierr webapi.Error
|
||||||
|
err := dec.Decode(&apierr)
|
||||||
|
tcheckf(t, err, "decoding json error")
|
||||||
|
tcompare(t, apierr.Code, expErrCode)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
testHTTP := func(method, path string, expCode int, expCT string) {
|
||||||
|
t.Helper()
|
||||||
|
testHTTPHdrsBody(s, method, path, nil, "", expCode, false, expCT, "")
|
||||||
|
}
|
||||||
|
|
||||||
|
testHTTP("GET", "/", http.StatusSeeOther, "")
|
||||||
|
testHTTP("POST", "/", http.StatusMethodNotAllowed, "")
|
||||||
|
testHTTP("GET", "/v0/", http.StatusOK, "text/html; charset=utf-8")
|
||||||
|
testHTTP("GET", "/other/", http.StatusNotFound, "")
|
||||||
|
testHTTP("GET", "/v0/Send", http.StatusOK, "text/html; charset=utf-8")
|
||||||
|
testHTTP("GET", "/v0/MessageRawGet", http.StatusOK, "text/html; charset=utf-8")
|
||||||
|
testHTTP("GET", "/v0/Bogus", http.StatusNotFound, "")
|
||||||
|
testHTTP("PUT", "/v0/Send", http.StatusMethodNotAllowed, "")
|
||||||
|
testHTTP("POST", "/v0/Send", http.StatusUnauthorized, "")
|
||||||
|
|
||||||
|
for i := 0; i < 11; i++ {
|
||||||
|
// Missing auth doesn't trigger auth rate limiter.
|
||||||
|
testHTTP("POST", "/v0/Send", http.StatusUnauthorized, "")
|
||||||
|
}
|
||||||
|
for i := 0; i < 21; i++ {
|
||||||
|
// Bad auth does.
|
||||||
|
expCode := http.StatusUnauthorized
|
||||||
|
tooMany := i >= 10
|
||||||
|
if i == 20 {
|
||||||
|
expCode = http.StatusTooManyRequests
|
||||||
|
}
|
||||||
|
testHTTPHdrsBody(s, "POST", "/v0/Send", map[string]string{"Authorization": "Basic " + base64.StdEncoding.EncodeToString([]byte("mjl@mox.example:badpassword"))}, "", expCode, tooMany, "", "")
|
||||||
|
}
|
||||||
|
mox.LimitersInit()
|
||||||
|
|
||||||
|
// Request with missing X-Forwarded-For.
|
||||||
|
sfwd := NewServer(100*1024, "/webapi/", true).(server)
|
||||||
|
testHTTPHdrsBody(sfwd, "POST", "/v0/Send", map[string]string{"Authorization": "Basic " + base64.StdEncoding.EncodeToString([]byte("mjl@mox.example:badpassword"))}, "", http.StatusInternalServerError, false, "", "")
|
||||||
|
|
||||||
|
// Body must be form, not JSON.
|
||||||
|
authz := "Basic " + base64.StdEncoding.EncodeToString([]byte("mjl@mox.example:"+pw1))
|
||||||
|
testHTTPHdrsBody(s, "POST", "/v0/Send", map[string]string{"Content-Type": "application/json", "Authorization": authz}, "{}", http.StatusBadRequest, false, "application/json; charset=utf-8", "protocol")
|
||||||
|
testHTTPHdrsBody(s, "POST", "/v0/Send", map[string]string{"Content-Type": "multipart/form-data", "Authorization": authz}, "not formdata", http.StatusBadRequest, false, "application/json; charset=utf-8", "protocol")
|
||||||
|
formAuth := map[string]string{
|
||||||
|
"Content-Type": "application/x-www-form-urlencoded",
|
||||||
|
"Authorization": authz,
|
||||||
|
}
|
||||||
|
testHTTPHdrsBody(s, "POST", "/v0/Send", formAuth, "not encoded\n\n", http.StatusBadRequest, false, "application/json; charset=utf-8", "protocol")
|
||||||
|
// Missing "request".
|
||||||
|
testHTTPHdrsBody(s, "POST", "/v0/Send", formAuth, "", http.StatusBadRequest, false, "application/json; charset=utf-8", "protocol")
|
||||||
|
// "request" must be JSON.
|
||||||
|
testHTTPHdrsBody(s, "POST", "/v0/Send", formAuth, "request=notjson", http.StatusBadRequest, false, "application/json; charset=utf-8", "protocol")
|
||||||
|
// "request" must be JSON object.
|
||||||
|
testHTTPHdrsBody(s, "POST", "/v0/Send", formAuth, "request=[]", http.StatusBadRequest, false, "application/json; charset=utf-8", "protocol")
|
||||||
|
|
||||||
|
// Send message. Look for the message in the queue.
|
||||||
|
now := time.Now()
|
||||||
|
yes := true
|
||||||
|
sendReq := webapi.SendRequest{
|
||||||
|
Message: webapi.Message{
|
||||||
|
From: []webapi.NameAddress{{Name: "møx", Address: "mjl@mox.example"}},
|
||||||
|
To: []webapi.NameAddress{{Name: "móx", Address: "mjl+to@mox.example"}, {Address: "mjl+to2@mox.example"}},
|
||||||
|
CC: []webapi.NameAddress{{Name: "möx", Address: "mjl+cc@mox.example"}},
|
||||||
|
BCC: []webapi.NameAddress{{Name: "møx", Address: "mjl+bcc@mox.example"}},
|
||||||
|
ReplyTo: []webapi.NameAddress{{Name: "reply1", Address: "mox+reply1@mox.example"}, {Name: "reply2", Address: "mox+reply2@mox.example"}},
|
||||||
|
MessageID: "<random@localhost>",
|
||||||
|
References: []string{"<messageid0@localhost>", "<messageid1@localhost>"},
|
||||||
|
Date: &now,
|
||||||
|
Subject: "¡hello world!",
|
||||||
|
Text: "hi ☺\n",
|
||||||
|
HTML: `<html><img src="cid:x" /></html>`, // Newline will be added.
|
||||||
|
},
|
||||||
|
Extra: map[string]string{"a": "123"},
|
||||||
|
Headers: [][2]string{{"x-custom", "header"}},
|
||||||
|
InlineFiles: []webapi.File{
|
||||||
|
{
|
||||||
|
Name: "x.png",
|
||||||
|
ContentType: "image/png",
|
||||||
|
ContentID: "<x>",
|
||||||
|
Data: base64.StdEncoding.EncodeToString([]byte("png data")),
|
||||||
|
},
|
||||||
|
},
|
||||||
|
AttachedFiles: []webapi.File{
|
||||||
|
{
|
||||||
|
Data: base64.StdEncoding.EncodeToString([]byte("%PDF-")), // Should be detected as PDF.
|
||||||
|
},
|
||||||
|
},
|
||||||
|
RequireTLS: &yes,
|
||||||
|
FutureRelease: &now,
|
||||||
|
SaveSent: true,
|
||||||
|
}
|
||||||
|
sendResp, err := client.Send(ctxbg, sendReq)
|
||||||
|
tcheckf(t, err, "send message")
|
||||||
|
tcompare(t, sendResp.MessageID, sendReq.Message.MessageID)
|
||||||
|
tcompare(t, len(sendResp.Submissions), 2+1+1) // 2 to, 1 cc, 1 bcc
|
||||||
|
subs := sendResp.Submissions
|
||||||
|
tcompare(t, subs[0].Address, "mjl+to@mox.example")
|
||||||
|
tcompare(t, subs[1].Address, "mjl+to2@mox.example")
|
||||||
|
tcompare(t, subs[2].Address, "mjl+cc@mox.example")
|
||||||
|
tcompare(t, subs[3].Address, "mjl+bcc@mox.example")
|
||||||
|
tcompare(t, subs[3].QueueMsgID, subs[0].QueueMsgID+3)
|
||||||
|
tcompare(t, subs[0].FromID, "")
|
||||||
|
// todo: look in queue for parameters. parse the message.
|
||||||
|
|
||||||
|
// Send a custom multipart/form-data POST, with different request parameters, and
|
||||||
|
// additional files.
|
||||||
|
var sb strings.Builder
|
||||||
|
mp := multipart.NewWriter(&sb)
|
||||||
|
fdSendReq := webapi.SendRequest{
|
||||||
|
Message: webapi.Message{
|
||||||
|
To: []webapi.NameAddress{{Address: "møx@mox.example"}},
|
||||||
|
// Let server assign date, message-id.
|
||||||
|
Subject: "test",
|
||||||
|
Text: "hi",
|
||||||
|
},
|
||||||
|
// Don't let server add its own user-agent.
|
||||||
|
Headers: [][2]string{{"User-Agent", "test"}},
|
||||||
|
}
|
||||||
|
sendReqBuf, err := json.Marshal(fdSendReq)
|
||||||
|
tcheckf(t, err, "send request")
|
||||||
|
mp.WriteField("request", string(sendReqBuf))
|
||||||
|
// Two inline PDFs.
|
||||||
|
pw, err := mp.CreateFormFile("inlinefile", "test.pdf")
|
||||||
|
tcheckf(t, err, "create inline pdf file")
|
||||||
|
_, err = fmt.Fprint(pw, "%PDF-")
|
||||||
|
tcheckf(t, err, "write pdf")
|
||||||
|
pw, err = mp.CreateFormFile("inlinefile", "test.pdf")
|
||||||
|
tcheckf(t, err, "create second inline pdf file")
|
||||||
|
_, err = fmt.Fprint(pw, "%PDF-")
|
||||||
|
tcheckf(t, err, "write second pdf")
|
||||||
|
|
||||||
|
// One attached PDF.
|
||||||
|
fh := textproto.MIMEHeader{}
|
||||||
|
fh.Set("Content-Disposition", `form-data; name="attachedfile"; filename="test.pdf"`)
|
||||||
|
fh.Set("Content-ID", "<testpdf>")
|
||||||
|
pw, err = mp.CreatePart(fh)
|
||||||
|
tcheckf(t, err, "create attached pdf file")
|
||||||
|
_, err = fmt.Fprint(pw, "%PDF-")
|
||||||
|
tcheckf(t, err, "write attached pdf")
|
||||||
|
fdct := mp.FormDataContentType()
|
||||||
|
err = mp.Close()
|
||||||
|
tcheckf(t, err, "close multipart")
|
||||||
|
|
||||||
|
// Perform custom POST.
|
||||||
|
req, err := http.NewRequest("POST", hs.URL+"/v0/Send", strings.NewReader(sb.String()))
|
||||||
|
tcheckf(t, err, "new request")
|
||||||
|
req.Header.Set("Content-Type", fdct)
|
||||||
|
// Use a unique MAIL FROM id when delivering.
|
||||||
|
req.Header.Set("Authorization", "Basic "+base64.StdEncoding.EncodeToString([]byte("mjl+fromid@mox.example:"+pw1)))
|
||||||
|
resp, err := http.DefaultClient.Do(req)
|
||||||
|
tcheckf(t, err, "request multipart/form-data")
|
||||||
|
tcompare(t, resp.StatusCode, http.StatusOK)
|
||||||
|
var sendRes webapi.SendResult
|
||||||
|
err = json.NewDecoder(resp.Body).Decode(&sendRes)
|
||||||
|
tcheckf(t, err, "parse send response")
|
||||||
|
tcompare(t, sendRes.MessageID != "", true)
|
||||||
|
tcompare(t, len(sendRes.Submissions), 1)
|
||||||
|
tcompare(t, sendRes.Submissions[0].FromID != "", true)
|
||||||
|
|
||||||
|
// Trigger various error conditions.
|
||||||
|
_, err = client.Send(ctxbg, webapi.SendRequest{
|
||||||
|
Message: webapi.Message{
|
||||||
|
To: []webapi.NameAddress{{Address: "mjl@mox.example"}},
|
||||||
|
Subject: "test",
|
||||||
|
},
|
||||||
|
})
|
||||||
|
terrcode(t, err, "missingBody")
|
||||||
|
|
||||||
|
_, err = client.Send(ctxbg, webapi.SendRequest{
|
||||||
|
Message: webapi.Message{
|
||||||
|
From: []webapi.NameAddress{{Address: "other@mox.example"}},
|
||||||
|
To: []webapi.NameAddress{{Address: "mjl@mox.example"}},
|
||||||
|
Subject: "test",
|
||||||
|
Text: "hi",
|
||||||
|
},
|
||||||
|
})
|
||||||
|
terrcode(t, err, "badFrom")
|
||||||
|
|
||||||
|
_, err = client.Send(ctxbg, webapi.SendRequest{
|
||||||
|
Message: webapi.Message{
|
||||||
|
From: []webapi.NameAddress{{Address: "mox@mox.example"}, {Address: "mox@mox.example"}},
|
||||||
|
To: []webapi.NameAddress{{Address: "mjl@mox.example"}},
|
||||||
|
Subject: "test",
|
||||||
|
Text: "hi",
|
||||||
|
},
|
||||||
|
})
|
||||||
|
terrcode(t, err, "multipleFrom")
|
||||||
|
|
||||||
|
_, err = client.Send(ctxbg, webapi.SendRequest{Message: webapi.Message{Subject: "test", Text: "hi"}})
|
||||||
|
terrcode(t, err, "noRecipients")
|
||||||
|
|
||||||
|
_, err = client.Send(ctxbg, webapi.SendRequest{
|
||||||
|
Message: webapi.Message{
|
||||||
|
MessageID: "missingltgt@localhost",
|
||||||
|
To: []webapi.NameAddress{{Address: "møx@mox.example"}},
|
||||||
|
Subject: "test",
|
||||||
|
Text: "hi",
|
||||||
|
},
|
||||||
|
})
|
||||||
|
terrcode(t, err, "malformedMessageID")
|
||||||
|
|
||||||
|
_, err = client.Send(ctxbg, webapi.SendRequest{
|
||||||
|
Message: webapi.Message{
|
||||||
|
MessageID: "missingltgt@localhost",
|
||||||
|
To: []webapi.NameAddress{{Address: "møx@mox.example"}},
|
||||||
|
Subject: "test",
|
||||||
|
Text: "hi",
|
||||||
|
},
|
||||||
|
})
|
||||||
|
terrcode(t, err, "malformedMessageID")
|
||||||
|
|
||||||
|
// todo: messageLimitReached, recipientLimitReached
|
||||||
|
|
||||||
|
// SuppressionList
|
||||||
|
supListRes, err := client.SuppressionList(ctxbg, webapi.SuppressionListRequest{})
|
||||||
|
tcheckf(t, err, "listing suppressions")
|
||||||
|
tcompare(t, len(supListRes.Suppressions), 0)
|
||||||
|
|
||||||
|
// SuppressionAdd
|
||||||
|
supAddReq := webapi.SuppressionAddRequest{EmailAddress: "Remote.Last-catchall@xn--74h.localhost", Manual: true, Reason: "tests"}
|
||||||
|
_, err = client.SuppressionAdd(ctxbg, supAddReq)
|
||||||
|
tcheckf(t, err, "add address to suppression list")
|
||||||
|
_, err = client.SuppressionAdd(ctxbg, supAddReq)
|
||||||
|
terrcode(t, err, "error") // Already present.
|
||||||
|
supAddReq2 := webapi.SuppressionAddRequest{EmailAddress: "remotelast@☺.localhost", Manual: false, Reason: "tests"}
|
||||||
|
_, err = client.SuppressionAdd(ctxbg, supAddReq2)
|
||||||
|
terrcode(t, err, "error") // Already present, same base address.
|
||||||
|
supAddReq3 := webapi.SuppressionAddRequest{EmailAddress: "not an address"}
|
||||||
|
_, err = client.SuppressionAdd(ctxbg, supAddReq3)
|
||||||
|
terrcode(t, err, "badAddress")
|
||||||
|
|
||||||
|
supListRes, err = client.SuppressionList(ctxbg, webapi.SuppressionListRequest{})
|
||||||
|
tcheckf(t, err, "listing suppressions")
|
||||||
|
tcompare(t, len(supListRes.Suppressions), 1)
|
||||||
|
supListRes.Suppressions[0].Created = now
|
||||||
|
tcompare(t, supListRes.Suppressions, []webapi.Suppression{
|
||||||
|
{
|
||||||
|
ID: 1,
|
||||||
|
Created: now,
|
||||||
|
Account: "mjl",
|
||||||
|
BaseAddress: "remotelast@☺.localhost",
|
||||||
|
OriginalAddress: "Remote.Last-catchall@☺.localhost",
|
||||||
|
Manual: true,
|
||||||
|
Reason: "tests",
|
||||||
|
},
|
||||||
|
})
|
||||||
|
|
||||||
|
// SuppressionPresent
|
||||||
|
supPresRes, err := client.SuppressionPresent(ctxbg, webapi.SuppressionPresentRequest{EmailAddress: "not@localhost"})
|
||||||
|
tcheckf(t, err, "address present")
|
||||||
|
tcompare(t, supPresRes.Present, false)
|
||||||
|
supPresRes, err = client.SuppressionPresent(ctxbg, webapi.SuppressionPresentRequest{EmailAddress: "remotelast@xn--74h.localhost"})
|
||||||
|
tcheckf(t, err, "address present")
|
||||||
|
tcompare(t, supPresRes.Present, true)
|
||||||
|
supPresRes, err = client.SuppressionPresent(ctxbg, webapi.SuppressionPresentRequest{EmailAddress: "Remote.Last-catchall@☺.localhost"})
|
||||||
|
tcheckf(t, err, "address present")
|
||||||
|
tcompare(t, supPresRes.Present, true)
|
||||||
|
supPresRes, err = client.SuppressionPresent(ctxbg, webapi.SuppressionPresentRequest{EmailAddress: "not an address"})
|
||||||
|
terrcode(t, err, "badAddress")
|
||||||
|
|
||||||
|
// SuppressionRemove
|
||||||
|
_, err = client.SuppressionRemove(ctxbg, webapi.SuppressionRemoveRequest{EmailAddress: "remote.LAST+more@☺.LocalHost"})
|
||||||
|
tcheckf(t, err, "remove suppressed address")
|
||||||
|
_, err = client.SuppressionRemove(ctxbg, webapi.SuppressionRemoveRequest{EmailAddress: "remote.LAST+more@☺.LocalHost"})
|
||||||
|
terrcode(t, err, "error") // Absent.
|
||||||
|
_, err = client.SuppressionRemove(ctxbg, webapi.SuppressionRemoveRequest{EmailAddress: "not an address"})
|
||||||
|
terrcode(t, err, "badAddress")
|
||||||
|
|
||||||
|
supListRes, err = client.SuppressionList(ctxbg, webapi.SuppressionListRequest{})
|
||||||
|
tcheckf(t, err, "listing suppressions")
|
||||||
|
tcompare(t, len(supListRes.Suppressions), 0)
|
||||||
|
|
||||||
|
// MessageGet, we retrieve the message we sent first.
|
||||||
|
msgRes, err := client.MessageGet(ctxbg, webapi.MessageGetRequest{MsgID: 1})
|
||||||
|
tcheckf(t, err, "remove suppressed address")
|
||||||
|
sentMsg := sendReq.Message
|
||||||
|
sentMsg.BCC = []webapi.NameAddress{} // todo: the Sent message should contain the BCC. for webmail too.
|
||||||
|
sentMsg.Date = msgRes.Message.Date
|
||||||
|
sentMsg.HTML += "\n"
|
||||||
|
tcompare(t, msgRes.Message, sentMsg)
|
||||||
|
// The structure is: mixed (related (alternative text html) inline-png) attached-pdf).
|
||||||
|
pdfpart := msgRes.Structure.Parts[1]
|
||||||
|
tcompare(t, pdfpart.ContentType, "application/pdf")
|
||||||
|
// structure compared below, parsed again from raw message.
|
||||||
|
// todo: compare Meta
|
||||||
|
|
||||||
|
_, err = client.MessageGet(ctxbg, webapi.MessageGetRequest{MsgID: 1 + 999})
|
||||||
|
terrcode(t, err, "messageNotFound")
|
||||||
|
|
||||||
|
// MessageRawGet
|
||||||
|
r, err := client.MessageRawGet(ctxbg, webapi.MessageRawGetRequest{MsgID: 1})
|
||||||
|
tcheckf(t, err, "get raw message")
|
||||||
|
var b bytes.Buffer
|
||||||
|
_, err = io.Copy(&b, r)
|
||||||
|
r.Close()
|
||||||
|
tcheckf(t, err, "reading raw message")
|
||||||
|
part, err := message.EnsurePart(log.Logger, true, bytes.NewReader(b.Bytes()), int64(b.Len()))
|
||||||
|
tcheckf(t, err, "parsing raw message")
|
||||||
|
tcompare(t, webhook.PartStructure(&part), msgRes.Structure)
|
||||||
|
|
||||||
|
_, err = client.MessageRawGet(ctxbg, webapi.MessageRawGetRequest{MsgID: 1 + 999})
|
||||||
|
terrcode(t, err, "messageNotFound")
|
||||||
|
|
||||||
|
// MessagePartGet
|
||||||
|
// The structure is: mixed (related (alternative text html) inline-png) attached-pdf).
|
||||||
|
r, err = client.MessagePartGet(ctxbg, webapi.MessagePartGetRequest{MsgID: 1, PartPath: []int{0, 0, 1}})
|
||||||
|
tcheckf(t, err, "get message part")
|
||||||
|
tdata(t, r, sendReq.HTML+"\r\n") // Part returns the raw data with \r\n line endings.
|
||||||
|
r.Close()
|
||||||
|
|
||||||
|
r, err = client.MessagePartGet(ctxbg, webapi.MessagePartGetRequest{MsgID: 1, PartPath: []int{}})
|
||||||
|
tcheckf(t, err, "get message part")
|
||||||
|
r.Close()
|
||||||
|
|
||||||
|
_, err = client.MessagePartGet(ctxbg, webapi.MessagePartGetRequest{MsgID: 1, PartPath: []int{2}})
|
||||||
|
terrcode(t, err, "partNotFound")
|
||||||
|
|
||||||
|
_, err = client.MessagePartGet(ctxbg, webapi.MessagePartGetRequest{MsgID: 1 + 999, PartPath: []int{}})
|
||||||
|
terrcode(t, err, "messageNotFound")
|
||||||
|
|
||||||
|
_, err = client.MessageFlagsAdd(ctxbg, webapi.MessageFlagsAddRequest{MsgID: 1, Flags: []string{`\answered`, "$Forwarded", "custom"}})
|
||||||
|
tcheckf(t, err, "add flags")
|
||||||
|
|
||||||
|
msgRes, err = client.MessageGet(ctxbg, webapi.MessageGetRequest{MsgID: 1})
|
||||||
|
tcheckf(t, err, "get message")
|
||||||
|
tcompare(t, slices.Contains(msgRes.Meta.Flags, `\answered`), true)
|
||||||
|
tcompare(t, slices.Contains(msgRes.Meta.Flags, "$forwarded"), true)
|
||||||
|
tcompare(t, slices.Contains(msgRes.Meta.Flags, "custom"), true)
|
||||||
|
|
||||||
|
// Setting duplicate flags doesn't make a change.
|
||||||
|
_, err = client.MessageFlagsAdd(ctxbg, webapi.MessageFlagsAddRequest{MsgID: 1, Flags: []string{`\Answered`, "$forwarded", "custom"}})
|
||||||
|
tcheckf(t, err, "add flags")
|
||||||
|
msgRes2, err := client.MessageGet(ctxbg, webapi.MessageGetRequest{MsgID: 1})
|
||||||
|
tcheckf(t, err, "get message")
|
||||||
|
tcompare(t, msgRes.Meta.Flags, msgRes2.Meta.Flags)
|
||||||
|
|
||||||
|
// Non-existing message gives generic user error.
|
||||||
|
_, err = client.MessageFlagsAdd(ctxbg, webapi.MessageFlagsAddRequest{MsgID: 1 + 999, Flags: []string{`\answered`, "$Forwarded", "custom"}})
|
||||||
|
terrcode(t, err, "messageNotFound")
|
||||||
|
|
||||||
|
// MessageFlagsRemove
|
||||||
|
_, err = client.MessageFlagsRemove(ctxbg, webapi.MessageFlagsRemoveRequest{MsgID: 1, Flags: []string{`\Answered`, "$forwarded", "custom"}})
|
||||||
|
tcheckf(t, err, "remove")
|
||||||
|
msgRes, err = client.MessageGet(ctxbg, webapi.MessageGetRequest{MsgID: 1})
|
||||||
|
tcheckf(t, err, "get message")
|
||||||
|
tcompare(t, slices.Contains(msgRes.Meta.Flags, `\answered`), false)
|
||||||
|
tcompare(t, slices.Contains(msgRes.Meta.Flags, "$forwarded"), false)
|
||||||
|
tcompare(t, slices.Contains(msgRes.Meta.Flags, "custom"), false)
|
||||||
|
// Can try removing again, no change.
|
||||||
|
_, err = client.MessageFlagsRemove(ctxbg, webapi.MessageFlagsRemoveRequest{MsgID: 1, Flags: []string{`\Answered`, "$forwarded", "custom"}})
|
||||||
|
tcheckf(t, err, "remove")
|
||||||
|
|
||||||
|
_, err = client.MessageFlagsRemove(ctxbg, webapi.MessageFlagsRemoveRequest{MsgID: 1 + 999, Flags: []string{`\Answered`, "$forwarded", "custom"}})
|
||||||
|
terrcode(t, err, "messageNotFound")
|
||||||
|
|
||||||
|
// MessageMove
|
||||||
|
tcompare(t, msgRes.Meta.MailboxName, "Sent")
|
||||||
|
_, err = client.MessageMove(ctxbg, webapi.MessageMoveRequest{MsgID: 1, DestMailboxName: "Inbox"})
|
||||||
|
tcheckf(t, err, "move to inbox")
|
||||||
|
msgRes, err = client.MessageGet(ctxbg, webapi.MessageGetRequest{MsgID: 1})
|
||||||
|
tcheckf(t, err, "get message")
|
||||||
|
tcompare(t, msgRes.Meta.MailboxName, "Inbox")
|
||||||
|
_, err = client.MessageMove(ctxbg, webapi.MessageMoveRequest{MsgID: 1, DestMailboxName: "Bogus"})
|
||||||
|
terrcode(t, err, "user")
|
||||||
|
_, err = client.MessageMove(ctxbg, webapi.MessageMoveRequest{MsgID: 1 + 999, DestMailboxName: "Inbox"})
|
||||||
|
terrcode(t, err, "messageNotFound")
|
||||||
|
|
||||||
|
// MessageDelete
|
||||||
|
_, err = client.MessageDelete(ctxbg, webapi.MessageDeleteRequest{MsgID: 1})
|
||||||
|
tcheckf(t, err, "delete message")
|
||||||
|
_, err = client.MessageDelete(ctxbg, webapi.MessageDeleteRequest{MsgID: 1})
|
||||||
|
terrcode(t, err, "user") // No longer.
|
||||||
|
_, err = client.MessageGet(ctxbg, webapi.MessageGetRequest{MsgID: 1})
|
||||||
|
terrcode(t, err, "messageNotFound") // No longer.
|
||||||
|
_, err = client.MessageDelete(ctxbg, webapi.MessageDeleteRequest{MsgID: 1 + 999})
|
||||||
|
terrcode(t, err, "messageNotFound")
|
||||||
|
}
|
||||||
|
|
||||||
|
func tdata(t *testing.T, r io.Reader, exp string) {
|
||||||
|
t.Helper()
|
||||||
|
buf, err := io.ReadAll(r)
|
||||||
|
tcheckf(t, err, "reading body")
|
||||||
|
tcompare(t, string(buf), exp)
|
||||||
|
}
|
|
@@ -129,7 +129,7 @@ func Check(ctx context.Context, log mlog.Log, sessionAuth SessionAuth, kind stri
 		return "", "", "", false
 	}
 
-	ip := remoteIP(log, isForwarded, r)
+	ip := RemoteIP(log, isForwarded, r)
 	if ip == nil {
 		respondAuthError("user:noAuth", "cannot find ip for rate limit check (missing x-forwarded-for header?)")
 		return "", "", "", false
@@ -181,7 +181,7 @@ func Check(ctx context.Context, log mlog.Log, sessionAuth SessionAuth, kind stri
 	return accountName, sessionToken, loginAddress, true
 }
 
-func remoteIP(log mlog.Log, isForwarded bool, r *http.Request) net.IP {
+func RemoteIP(log mlog.Log, isForwarded bool, r *http.Request) net.IP {
 	if isForwarded {
 		s := r.Header.Get("X-Forwarded-For")
 		ipstr := strings.TrimSpace(strings.Split(s, ",")[0])
@@ -230,7 +230,7 @@ func Login(ctx context.Context, log mlog.Log, sessionAuth SessionAuth, kind, coo
 		return "", &sherpa.Error{Code: "user:error", Message: "missing login token"}
 	}
 
-	ip := remoteIP(log, isForwarded, r)
+	ip := RemoteIP(log, isForwarded, r)
 	if ip == nil {
 		return "", fmt.Errorf("cannot find ip for rate limit check (missing x-forwarded-for header?)")
 	}

webhook/webhook.go (new file, 163 lines)
@@ -0,0 +1,163 @@
// Package webhook has data types used for webhooks about incoming and outgoing deliveries.
//
// See package webapi for details about the webapi and webhooks.
//
// Types [Incoming] and [Outgoing] represent the JSON bodies sent in the webhooks.
// New fields may be added in the future, unrecognized fields should be ignored
// when parsing for forward compatibility.
package webhook

import (
	"strings"
	"time"

	"github.com/mjl-/mox/message"
)

// OutgoingEvent is an activity for an outgoing delivery. Either generated by the
// queue, or through an incoming DSN (delivery status notification) message.
type OutgoingEvent string

// note: outgoing hook events are in ../queue/hooks.go, ../mox-/config.go, ../queue.go and ../webapi/gendoc.sh. keep in sync.

// todo: in future have more events: for spam complaints, perhaps mdn's.

const (
	// Message was accepted by a next-hop server. This does not necessarily mean the
	// message has been delivered in the mailbox of the user.
	EventDelivered OutgoingEvent = "delivered"

	// Outbound delivery was suppressed because the recipient address is on the
	// suppression list of the account, or a simplified/base variant of the address is.
	EventSuppressed OutgoingEvent = "suppressed"

	// A delivery attempt failed but delivery will be retried again later.
	EventDelayed OutgoingEvent = "delayed"

	// Delivery of the message failed and will not be tried again. Also see the
	// "Suppressing" field of [Outgoing].
	EventFailed OutgoingEvent = "failed"

	// Message was relayed into a system that does not generate DSNs. Should only
	// happen when explicitly requested.
	EventRelayed OutgoingEvent = "relayed"

	// Message was accepted and is being delivered to multiple recipients (e.g. the
	// address was an alias/list), which may generate more DSNs.
	EventExpanded OutgoingEvent = "expanded"

	// Message was removed from the queue, e.g. canceled by admin/user.
	EventCanceled OutgoingEvent = "canceled"

	// An incoming message was received that was either a DSN with an unknown event
	// type ("action"), or an incoming non-DSN-message was received for the unique
	// per-outgoing-message address used for sending.
	EventUnrecognized OutgoingEvent = "unrecognized"
)

// Outgoing is the payload sent to webhook URLs for events about outgoing deliveries.
type Outgoing struct {
	Version          int               // Format of hook, currently 0.
	Event            OutgoingEvent     // Type of outgoing delivery event.
	DSN              bool              // If this event was triggered by a delivery status notification message (DSN).
	Suppressing      bool              // If true, this failure caused the address to be added to the suppression list.
	QueueMsgID       int64             // ID of message in queue.
	FromID           string            // As used in MAIL FROM, can be empty, for incoming messages.
	MessageID        string            // From Message-Id header, as set by submitter or us, with enclosing <>.
	Subject          string            // Of original message.
	WebhookQueued    time.Time         // When webhook was first queued for delivery.
	SMTPCode         int               // Optional, for errors only, e.g. 451, 550. See package smtp for definitions.
	SMTPEnhancedCode string            // Optional, for errors only, e.g. 5.1.1.
	Error            string            // Error message while delivering, or from DSN from remote, if any.
	Extra            map[string]string // Extra fields set for message during submit, through webapi call or through X-Mox-Extra-* headers during SMTP submission.
}

// Incoming is the data sent to a webhook for incoming deliveries over SMTP.
type Incoming struct {
	Version int // Format of hook, currently 0.

	// Message "From" header, typically has one address.
	From []NameAddress

	To  []NameAddress
	CC  []NameAddress
	BCC []NameAddress // Often empty, even if you were a BCC recipient.

	// Optional Reply-To header, typically absent or with one address.
	ReplyTo []NameAddress

	Subject string

	// Of Message-Id header, typically of the form "<random@hostname>", includes <>.
	MessageID string

	// Optional, the message-id this message is a reply to. Includes <>.
	InReplyTo string

	// Optional, zero or more message-ids this message is a reply/forward/related to.
	// The last entry is the most recent/immediate message this is a reply to. Earlier
	// entries are the parents in a thread. Values include <>.
	References []string

	// Time in "Date" message header, can be different from time received.
	Date *time.Time

	// Contents of text/plain and/or text/html part (if any), with "\n" line-endings,
	// converted from "\r\n". Values are truncated to 1MB (1024*1024 bytes). Use webapi
	// MessagePartGet to retrieve the full part data.
	Text string
	HTML string
	// No files, can be large.

	Structure Structure    // Parsed form of MIME message.
	Meta      IncomingMeta // Details about message in storage, and SMTP transaction details.
}

type IncomingMeta struct {
	MsgID               int64     // ID of message in storage, and to use in webapi calls like MessageGet.
	MailFrom            string    // Address used during SMTP "MAIL FROM" command.
	MailFromValidated   bool      // Whether SMTP MAIL FROM address was SPF-validated.
	MsgFromValidated    bool      // Whether address in message "From"-header was DMARC(-like) validated.
	RcptTo              string    // SMTP RCPT TO address used in SMTP.
	DKIMVerifiedDomains []string  // Verified domains from DKIM-signature in message. Can be different domain than used in addresses.
	RemoteIP            string    // Where the message was delivered from.
	Received            time.Time // When message was received, may be different from the Date header.
	MailboxName         string    // Mailbox where message was delivered to, based on configured rules. Defaults to "Inbox".

	// Whether this message was automated and should not receive automated replies.
	// E.g. out of office or mailing list messages.
	Automated bool
}

type NameAddress struct {
	Name    string // Optional, human-readable "display name" of the addressee.
	Address string // Required, email address.
}

type Structure struct {
	ContentType       string            // Lower case, e.g. text/plain.
	ContentTypeParams map[string]string // Lower case keys, original case values, e.g. {"charset": "UTF-8"}.
	ContentID         string            // Can be empty. Otherwise, should be a value wrapped in <>'s. For use in HTML, referenced as URI `cid:...`.
	DecodedSize       int64             // Size of content after decoding content-transfer-encoding. For text and HTML parts, this can be larger than the data returned since this size includes \r\n line endings.
	Parts             []Structure       // Subparts of a multipart message, possibly recursive.
}

// PartStructure returns a Structure for a parsed message part.
func PartStructure(p *message.Part) Structure {
	parts := make([]Structure, len(p.Parts))
	for i := range p.Parts {
		parts[i] = PartStructure(&p.Parts[i])
	}
	s := Structure{
		ContentType:       strings.ToLower(p.MediaType + "/" + p.MediaSubType),
		ContentTypeParams: p.ContentTypeParams,
		ContentID:         p.ContentID,
		DecodedSize:       p.DecodedSize,
		Parts:             parts,
	}
	// Replace nil map with empty map, for easier to use JSON.
	if s.ContentTypeParams == nil {
		s.ContentTypeParams = map[string]string{}
	}
	return s
}
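
As an illustration (not from the diff), a minimal sketch of an HTTP handler an
application could run to receive these webhook calls for incoming deliveries;
the listen address and URL path are assumptions:

	package main

	import (
		"encoding/json"
		"log"
		"net/http"

		"github.com/mjl-/mox/webhook"
	)

	func main() {
		http.HandleFunc("/mox/incoming", func(w http.ResponseWriter, r *http.Request) {
			id := r.Header.Get("X-Mox-Webhook-ID")
			attempt := r.Header.Get("X-Mox-Webhook-Attempt")

			var in webhook.Incoming
			// Unknown fields may be added in future versions, so don't disallow them.
			if err := json.NewDecoder(r.Body).Decode(&in); err != nil {
				http.Error(w, "bad json", http.StatusBadRequest)
				return
			}
			log.Printf("webhook %s (attempt %s): msg %d, subject %q", id, attempt, in.Meta.MsgID, in.Subject)
			// Respond with a 2xx status; failed webhook deliveries are retried.
			w.WriteHeader(http.StatusOK)
		})
		log.Fatal(http.ListenAndServe("localhost:8080", nil))
	}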

webmail/api.go (412 changed lines)
@ -17,7 +17,7 @@ import (
|
||||||
"net/textproto"
|
"net/textproto"
|
||||||
"os"
|
"os"
|
||||||
"runtime/debug"
|
"runtime/debug"
|
||||||
"sort"
|
"slices"
|
||||||
"strings"
|
"strings"
|
||||||
"sync"
|
"sync"
|
||||||
"time"
|
"time"
|
||||||
|
@ -45,6 +45,7 @@ import (
|
||||||
"github.com/mjl-/mox/smtpclient"
|
"github.com/mjl-/mox/smtpclient"
|
||||||
"github.com/mjl-/mox/store"
|
"github.com/mjl-/mox/store"
|
||||||
"github.com/mjl-/mox/webauth"
|
"github.com/mjl-/mox/webauth"
|
||||||
|
"github.com/mjl-/mox/webops"
|
||||||
)
|
)
|
||||||
|
|
||||||
//go:embed api.json
|
//go:embed api.json
|
||||||
|
@ -266,6 +267,20 @@ func xmessageID(ctx context.Context, tx *bstore.Tx, messageID int64) store.Messa
|
||||||
return m
|
return m
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func xrandomID(ctx context.Context, n int) string {
|
||||||
|
return base64.RawURLEncoding.EncodeToString(xrandom(ctx, n))
|
||||||
|
}
|
||||||
|
|
||||||
|
func xrandom(ctx context.Context, n int) []byte {
|
||||||
|
buf := make([]byte, n)
|
||||||
|
x, err := cryptorand.Read(buf)
|
||||||
|
xcheckf(ctx, err, "read random")
|
||||||
|
if x != n {
|
||||||
|
xcheckf(ctx, errors.New("short random read"), "read random")
|
||||||
|
}
|
||||||
|
return buf
|
||||||
|
}
|
||||||
|
|
||||||
// MessageSubmit sends a message by submitting it the outgoing email queue. The
|
// MessageSubmit sends a message by submitting it the outgoing email queue. The
|
||||||
// message is sent to all addresses listed in the To, Cc and Bcc addresses, without
|
// message is sent to all addresses listed in the To, Cc and Bcc addresses, without
|
||||||
// Bcc message header.
|
// Bcc message header.
|
||||||
|
@ -273,9 +288,9 @@ func xmessageID(ctx context.Context, tx *bstore.Tx, messageID int64) store.Messa
|
||||||
// If a Sent mailbox is configured, messages are added to it after submitting
|
// If a Sent mailbox is configured, messages are added to it after submitting
|
||||||
// to the delivery queue.
|
// to the delivery queue.
|
||||||
func (w Webmail) MessageSubmit(ctx context.Context, m SubmitMessage) {
|
func (w Webmail) MessageSubmit(ctx context.Context, m SubmitMessage) {
|
||||||
// Similar between ../smtpserver/server.go:/submit\( and ../webmail/webmail.go:/MessageSubmit\(
|
// Similar between ../smtpserver/server.go:/submit\( and ../webmail/api.go:/MessageSubmit\( and ../webapisrv/server.go:/Send\(
|
||||||
|
|
||||||
// todo: consider making this an HTTP POST, so we can upload as regular form, which is probably more efficient for encoding for the client and we can stream the data in.
|
// todo: consider making this an HTTP POST, so we can upload as regular form, which is probably more efficient for encoding for the client and we can stream the data in. also not unlike the webapi Submit method.
|
||||||
|
|
||||||
// Prevent any accidental control characters, or attempts at getting bare \r or \n
|
// Prevent any accidental control characters, or attempts at getting bare \r or \n
|
||||||
// into messages.
|
// into messages.
|
||||||
|
@ -358,10 +373,10 @@ func (w Webmail) MessageSubmit(ctx context.Context, m SubmitMessage) {
|
||||||
msglimit, rcptlimit, err := acc.SendLimitReached(tx, rcpts)
|
msglimit, rcptlimit, err := acc.SendLimitReached(tx, rcpts)
|
||||||
if msglimit >= 0 {
|
if msglimit >= 0 {
|
||||||
metricSubmission.WithLabelValues("messagelimiterror").Inc()
|
metricSubmission.WithLabelValues("messagelimiterror").Inc()
|
||||||
xcheckuserf(ctx, errors.New("send message limit reached"), "checking outgoing rate limit")
|
xcheckuserf(ctx, errors.New("message limit reached"), "checking outgoing rate")
|
||||||
} else if rcptlimit >= 0 {
|
} else if rcptlimit >= 0 {
|
||||||
metricSubmission.WithLabelValues("recipientlimiterror").Inc()
|
metricSubmission.WithLabelValues("recipientlimiterror").Inc()
|
||||||
xcheckuserf(ctx, errors.New("send message limit reached"), "checking outgoing rate limit")
|
xcheckuserf(ctx, errors.New("recipient limit reached"), "checking outgoing rate")
|
||||||
}
|
}
|
||||||
xcheckf(ctx, err, "checking send limit")
|
xcheckf(ctx, err, "checking send limit")
|
||||||
})
|
})
|
||||||
|
@ -455,15 +470,19 @@ func (w Webmail) MessageSubmit(ctx context.Context, m SubmitMessage) {
|
||||||
if rp.Envelope == nil {
|
if rp.Envelope == nil {
|
||||||
return
|
return
|
||||||
}
|
}
|
||||||
xc.Header("In-Reply-To", rp.Envelope.MessageID)
|
|
||||||
ref := h.Get("References")
|
if rp.Envelope.MessageID != "" {
|
||||||
if ref == "" {
|
xc.Header("In-Reply-To", rp.Envelope.MessageID)
|
||||||
ref = h.Get("In-Reply-To")
|
|
||||||
}
|
}
|
||||||
if ref != "" {
|
refs := h.Values("References")
|
||||||
xc.Header("References", ref+"\r\n\t"+rp.Envelope.MessageID)
|
if len(refs) == 0 && rp.Envelope.InReplyTo != "" {
|
||||||
} else {
|
refs = []string{rp.Envelope.InReplyTo}
|
||||||
xc.Header("References", rp.Envelope.MessageID)
|
}
|
||||||
|
if rp.Envelope.MessageID != "" {
|
||||||
|
refs = append(refs, rp.Envelope.MessageID)
|
||||||
|
}
|
||||||
|
if len(refs) > 0 {
|
||||||
|
xc.Header("References", strings.Join(refs, "\r\n\t"))
|
||||||
}
|
}
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
|
@ -480,7 +499,7 @@ func (w Webmail) MessageSubmit(ctx context.Context, m SubmitMessage) {
|
||||||
xc.Header("Content-Type", fmt.Sprintf(`multipart/mixed; boundary="%s"`, mp.Boundary()))
 xc.Header("Content-Type", fmt.Sprintf(`multipart/mixed; boundary="%s"`, mp.Boundary()))
 xc.Line()

-textBody, ct, cte := xc.TextPart(m.TextBody)
+textBody, ct, cte := xc.TextPart("plain", m.TextBody)
 textHdr := textproto.MIMEHeader{}
 textHdr.Set("Content-Type", ct)
 textHdr.Set("Content-Transfer-Encoding", cte)
@@ -601,7 +620,7 @@ func (w Webmail) MessageSubmit(ctx context.Context, m SubmitMessage) {
 err = mp.Close()
 xcheckf(ctx, err, "writing mime multipart")
 } else {
-textBody, ct, cte := xc.TextPart(m.TextBody)
+textBody, ct, cte := xc.TextPart("plain", m.TextBody)
 xc.Header("Content-Type", ct)
 xc.Header("Content-Transfer-Encoding", cte)
 xc.Line()
@@ -625,13 +644,25 @@ func (w Webmail) MessageSubmit(ctx context.Context, m SubmitMessage) {
 msgPrefix = dkimHeaders
 }

-fromPath := smtp.Path{
-Localpart: fromAddr.Address.Localpart,
-IPDomain: dns.IPDomain{Domain: fromAddr.Address.Domain},
+accConf, _ := acc.Conf()
+loginAddr, err := smtp.ParseAddress(reqInfo.LoginAddress)
+xcheckf(ctx, err, "parsing login address")
+useFromID := slices.Contains(accConf.ParsedFromIDLoginAddresses, loginAddr)
+fromPath := fromAddr.Address.Path()
+var localpartBase string
+if useFromID {
+localpartBase = strings.SplitN(string(fromPath.Localpart), confDom.LocalpartCatchallSeparator, 2)[0]
 }
 qml := make([]queue.Msg, len(recipients))
 now := time.Now()
 for i, rcpt := range recipients {
+fp := fromPath
+var fromID string
+if useFromID {
+fromID = xrandomID(ctx, 16)
+fp.Localpart = smtp.Localpart(localpartBase + confDom.LocalpartCatchallSeparator + fromID)
+}

 // Don't use per-recipient unique message prefix when multiple recipients are
 // present, or the queue cannot deliver it in a single smtp transaction.
 var recvRcpt string
@@ -644,7 +675,7 @@ func (w Webmail) MessageSubmit(ctx context.Context, m SubmitMessage) {
 Localpart: rcpt.Localpart,
 IPDomain: dns.IPDomain{Domain: rcpt.Domain},
 }
-qm := queue.MakeMsg(fromPath, toPath, xc.Has8bit, xc.SMTPUTF8, msgSize, messageID, []byte(rcptMsgPrefix), m.RequireTLS, now)
+qm := queue.MakeMsg(fp, toPath, xc.Has8bit, xc.SMTPUTF8, msgSize, messageID, []byte(rcptMsgPrefix), m.RequireTLS, now, m.Subject)
 if m.FutureRelease != nil {
 ival := time.Until(*m.FutureRelease)
 if ival < 0 {
@@ -656,6 +687,8 @@ func (w Webmail) MessageSubmit(ctx context.Context, m SubmitMessage) {
 qm.FutureReleaseRequest = "until;" + m.FutureRelease.Format(time.RFC3339)
 // todo: possibly add a header to the message stored in the Sent mailbox to indicate it was scheduled for later delivery.
 }
+qm.FromID = fromID
+// no qm.Extra from webmail
 qml[i] = qm
 }
 err = queue.Add(ctx, log, reqInfo.AccountName, dataFile, qml...)
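Aside, not part of the patch: the hunks above build a per-message unique sender address from the base localpart, the domain's catchall separator, and a random fromid, so that later bounces can be matched back to the queued message. The sketch below mirrors that construction in isolation. The names are illustrative, and the base64url encoding of 16 random bytes is an assumption (the patch calls xrandomID, whose exact encoding is not shown in this diff).

package main

import (
    "crypto/rand"
    "encoding/base64"
    "fmt"
)

// uniqueFrom composes base localpart + separator + random fromid + domain.
func uniqueFrom(localpartBase, separator, domain string) string {
    buf := make([]byte, 16)
    if _, err := rand.Read(buf); err != nil {
        panic(err)
    }
    fromID := base64.RawURLEncoding.EncodeToString(buf)
    return fmt.Sprintf("%s%s%s@%s", localpartBase, separator, fromID, domain)
}

func main() {
    // Prints something like "newsletter+3q2XyA...@example.org"; a DSN returned to
    // this address can be related to the original outgoing message via the fromid.
    fmt.Println(uniqueFrom("newsletter", "+", "example.org"))
}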
@@ -764,124 +797,13 @@ func (Webmail) MessageMove(ctx context.Context, messageIDs []int64, mailboxID in
 log.Check(err, "closing account")
 }()

-acc.WithRLock(func() {
-retrain := make([]store.Message, 0, len(messageIDs))
-removeChanges := map[int64]store.ChangeRemoveUIDs{}
-// n adds, 1 remove, 2 mailboxcounts, optimistic and at least for a single message.
-changes := make([]store.Change, 0, len(messageIDs)+3)
-
-xdbwrite(ctx, acc, func(tx *bstore.Tx) {
-var mbSrc store.Mailbox
-var modseq store.ModSeq
-
-mbDst := xmailboxID(ctx, tx, mailboxID)
-
-if len(messageIDs) == 0 {
-return
-}
-
-keywords := map[string]struct{}{}
-
-for _, mid := range messageIDs {
-m := xmessageID(ctx, tx, mid)
-
-// We may have loaded this mailbox in the previous iteration of this loop.
-if m.MailboxID != mbSrc.ID {
-if mbSrc.ID != 0 {
-err = tx.Update(&mbSrc)
-xcheckf(ctx, err, "updating source mailbox counts")
-changes = append(changes, mbSrc.ChangeCounts())
-}
-mbSrc = xmailboxID(ctx, tx, m.MailboxID)
-}
-
-if mbSrc.ID == mailboxID {
-// Client should filter out messages that are already in mailbox.
-xcheckuserf(ctx, errors.New("already in destination mailbox"), "moving message")
-}
-
-if modseq == 0 {
-modseq, err = acc.NextModSeq(tx)
-xcheckf(ctx, err, "assigning next modseq")
-}
-
-ch := removeChanges[m.MailboxID]
-ch.UIDs = append(ch.UIDs, m.UID)
-ch.ModSeq = modseq
-ch.MailboxID = m.MailboxID
-removeChanges[m.MailboxID] = ch
-
-// Copy of message record that we'll insert when UID is freed up.
-om := m
-om.PrepareExpunge()
-om.ID = 0 // Assign new ID.
-om.ModSeq = modseq
-
-mbSrc.Sub(m.MailboxCounts())
-
-if mbDst.Trash {
-m.Seen = true
-}
-conf, _ := acc.Conf()
-m.MailboxID = mbDst.ID
-if m.IsReject && m.MailboxDestinedID != 0 {
-// Incorrectly delivered to Rejects mailbox. Adjust MailboxOrigID so this message
-// is used for reputation calculation during future deliveries.
-m.MailboxOrigID = m.MailboxDestinedID
-m.IsReject = false
-m.Seen = false
-}
-m.UID = mbDst.UIDNext
-m.ModSeq = modseq
-mbDst.UIDNext++
-m.JunkFlagsForMailbox(mbDst, conf)
-err = tx.Update(&m)
-xcheckf(ctx, err, "updating moved message in database")
-
-// Now that UID is unused, we can insert the old record again.
-err = tx.Insert(&om)
-xcheckf(ctx, err, "inserting record for expunge after moving message")
-
-mbDst.Add(m.MailboxCounts())
-
-changes = append(changes, m.ChangeAddUID())
-retrain = append(retrain, m)
-
-for _, kw := range m.Keywords {
-keywords[kw] = struct{}{}
-}
-}
-
-err = tx.Update(&mbSrc)
-xcheckf(ctx, err, "updating source mailbox counts")
-
-changes = append(changes, mbSrc.ChangeCounts(), mbDst.ChangeCounts())
-
-// Ensure destination mailbox has keywords of the moved messages.
-var mbKwChanged bool
-mbDst.Keywords, mbKwChanged = store.MergeKeywords(mbDst.Keywords, maps.Keys(keywords))
-if mbKwChanged {
-changes = append(changes, mbDst.ChangeKeywords())
-}
-
-err = tx.Update(&mbDst)
-xcheckf(ctx, err, "updating mailbox with uidnext")
-
-err = acc.RetrainMessages(ctx, log, tx, retrain, false)
-xcheckf(ctx, err, "retraining messages after move")
-})
-
-// Ensure UIDs of the removed message are in increasing order. It is quite common
-// for all messages to be from a single source mailbox, meaning this is just one
-// change, for which we preallocated space.
-for _, ch := range removeChanges {
-sort.Slice(ch.UIDs, func(i, j int) bool {
-return ch.UIDs[i] < ch.UIDs[j]
-})
-changes = append(changes, ch)
-}
-store.BroadcastChanges(acc, changes)
-})
+xops.MessageMove(ctx, log, acc, messageIDs, "", mailboxID)
+}
+
+var xops = webops.XOps{
+DBWrite: xdbwrite,
+Checkf: xcheckf,
+Checkuserf: xcheckuserf,
 }

 // MessageDelete permanently deletes messages, without moving them to the Trash mailbox.
@@ -899,86 +821,7 @@ func (Webmail) MessageDelete(ctx context.Context, messageIDs []int64) {
 return
 }

-acc.WithWLock(func() {
-removeChanges := map[int64]store.ChangeRemoveUIDs{}
-changes := make([]store.Change, 0, len(messageIDs)+1) // n remove, 1 mailbox counts
-
-xdbwrite(ctx, acc, func(tx *bstore.Tx) {
-var modseq store.ModSeq
-var mb store.Mailbox
-remove := make([]store.Message, 0, len(messageIDs))
-
-var totalSize int64
-for _, mid := range messageIDs {
-m := xmessageID(ctx, tx, mid)
-totalSize += m.Size
-
-if m.MailboxID != mb.ID {
-if mb.ID != 0 {
-err := tx.Update(&mb)
-xcheckf(ctx, err, "updating mailbox counts")
-changes = append(changes, mb.ChangeCounts())
-}
-mb = xmailboxID(ctx, tx, m.MailboxID)
-}
-
-qmr := bstore.QueryTx[store.Recipient](tx)
-qmr.FilterEqual("MessageID", m.ID)
-_, err = qmr.Delete()
-xcheckf(ctx, err, "removing message recipients")
-
-mb.Sub(m.MailboxCounts())
-
-if modseq == 0 {
-modseq, err = acc.NextModSeq(tx)
-xcheckf(ctx, err, "assigning next modseq")
-}
-m.Expunged = true
-m.ModSeq = modseq
-err = tx.Update(&m)
-xcheckf(ctx, err, "marking message as expunged")
-
-ch := removeChanges[m.MailboxID]
-ch.UIDs = append(ch.UIDs, m.UID)
-ch.MailboxID = m.MailboxID
-ch.ModSeq = modseq
-removeChanges[m.MailboxID] = ch
-remove = append(remove, m)
-}
-
-if mb.ID != 0 {
-err := tx.Update(&mb)
-xcheckf(ctx, err, "updating count in mailbox")
-changes = append(changes, mb.ChangeCounts())
-}
-
-err = acc.AddMessageSize(log, tx, -totalSize)
-xcheckf(ctx, err, "updating disk usage")
-
-// Mark removed messages as not needing training, then retrain them, so if they
-// were trained, they get untrained.
-for i := range remove {
-remove[i].Junk = false
-remove[i].Notjunk = false
-}
-err = acc.RetrainMessages(ctx, log, tx, remove, true)
-xcheckf(ctx, err, "untraining deleted messages")
-})
-
-for _, ch := range removeChanges {
-sort.Slice(ch.UIDs, func(i, j int) bool {
-return ch.UIDs[i] < ch.UIDs[j]
-})
-changes = append(changes, ch)
-}
-store.BroadcastChanges(acc, changes)
-})
-
-for _, mID := range messageIDs {
-p := acc.MessagePath(mID)
-err := os.Remove(p)
-log.Check(err, "removing message file for expunge")
-}
+xops.MessageDelete(ctx, log, acc, messageIDs)
 }

 // FlagsAdd adds flags, either system flags like \Seen or custom keywords. The
@@ -993,76 +836,7 @@ func (Webmail) FlagsAdd(ctx context.Context, messageIDs []int64, flaglist []stri
 log.Check(err, "closing account")
 }()

-flags, keywords, err := store.ParseFlagsKeywords(flaglist)
-xcheckuserf(ctx, err, "parsing flags")
-
-acc.WithRLock(func() {
-var changes []store.Change
-
-xdbwrite(ctx, acc, func(tx *bstore.Tx) {
-var modseq store.ModSeq
-var retrain []store.Message
-var mb, origmb store.Mailbox
-
-for _, mid := range messageIDs {
-m := xmessageID(ctx, tx, mid)
-
-if mb.ID != m.MailboxID {
-if mb.ID != 0 {
-err := tx.Update(&mb)
-xcheckf(ctx, err, "updating mailbox")
-if mb.MailboxCounts != origmb.MailboxCounts {
-changes = append(changes, mb.ChangeCounts())
-}
-if mb.KeywordsChanged(origmb) {
-changes = append(changes, mb.ChangeKeywords())
-}
-}
-mb = xmailboxID(ctx, tx, m.MailboxID)
-origmb = mb
-}
-mb.Keywords, _ = store.MergeKeywords(mb.Keywords, keywords)
-
-mb.Sub(m.MailboxCounts())
-oflags := m.Flags
-m.Flags = m.Flags.Set(flags, flags)
-var kwChanged bool
-m.Keywords, kwChanged = store.MergeKeywords(m.Keywords, keywords)
-mb.Add(m.MailboxCounts())
-
-if m.Flags == oflags && !kwChanged {
-continue
-}
-
-if modseq == 0 {
-modseq, err = acc.NextModSeq(tx)
-xcheckf(ctx, err, "assigning next modseq")
-}
-m.ModSeq = modseq
-err = tx.Update(&m)
-xcheckf(ctx, err, "updating message")
-
-changes = append(changes, m.ChangeFlags(oflags))
-retrain = append(retrain, m)
-}
-
-if mb.ID != 0 {
-err := tx.Update(&mb)
-xcheckf(ctx, err, "updating mailbox")
-if mb.MailboxCounts != origmb.MailboxCounts {
-changes = append(changes, mb.ChangeCounts())
-}
-if mb.KeywordsChanged(origmb) {
-changes = append(changes, mb.ChangeKeywords())
-}
-}
-
-err = acc.RetrainMessages(ctx, log, tx, retrain, false)
-xcheckf(ctx, err, "retraining messages")
-})
-
-store.BroadcastChanges(acc, changes)
-})
+xops.MessageFlagsAdd(ctx, log, acc, messageIDs, flaglist)
 }

 // FlagsClear clears flags, either system flags like \Seen or custom keywords.
@@ -1076,71 +850,7 @@ func (Webmail) FlagsClear(ctx context.Context, messageIDs []int64, flaglist []st
 log.Check(err, "closing account")
 }()

-flags, keywords, err := store.ParseFlagsKeywords(flaglist)
-xcheckuserf(ctx, err, "parsing flags")
-
-acc.WithRLock(func() {
-var retrain []store.Message
-var changes []store.Change
-
-xdbwrite(ctx, acc, func(tx *bstore.Tx) {
-var modseq store.ModSeq
-var mb, origmb store.Mailbox
-
-for _, mid := range messageIDs {
-m := xmessageID(ctx, tx, mid)
-
-if mb.ID != m.MailboxID {
-if mb.ID != 0 {
-err := tx.Update(&mb)
-xcheckf(ctx, err, "updating counts for mailbox")
-if mb.MailboxCounts != origmb.MailboxCounts {
-changes = append(changes, mb.ChangeCounts())
-}
-// note: cannot remove keywords from mailbox by removing keywords from message.
-}
-mb = xmailboxID(ctx, tx, m.MailboxID)
-origmb = mb
-}
-
-oflags := m.Flags
-mb.Sub(m.MailboxCounts())
-m.Flags = m.Flags.Set(flags, store.Flags{})
-var changed bool
-m.Keywords, changed = store.RemoveKeywords(m.Keywords, keywords)
-mb.Add(m.MailboxCounts())
-
-if m.Flags == oflags && !changed {
-continue
-}
-
-if modseq == 0 {
-modseq, err = acc.NextModSeq(tx)
-xcheckf(ctx, err, "assigning next modseq")
-}
-m.ModSeq = modseq
-err = tx.Update(&m)
-xcheckf(ctx, err, "updating message")
-
-changes = append(changes, m.ChangeFlags(oflags))
-retrain = append(retrain, m)
-}
-
-if mb.ID != 0 {
-err := tx.Update(&mb)
-xcheckf(ctx, err, "updating keywords in mailbox")
-if mb.MailboxCounts != origmb.MailboxCounts {
-changes = append(changes, mb.ChangeCounts())
-}
-// note: cannot remove keywords from mailbox by removing keywords from message.
-}
-
-err = acc.RetrainMessages(ctx, log, tx, retrain, false)
-xcheckf(ctx, err, "retraining messages")
-})
-
-store.BroadcastChanges(acc, changes)
-})
+xops.MessageFlagsClear(ctx, log, acc, messageIDs, flaglist)
 }

 // MailboxCreate creates a new mailbox.
@@ -982,14 +982,14 @@
 },
 {
 "Name": "InReplyTo",
-"Docs": "",
+"Docs": "From In-Reply-To header, includes \u003c\u003e.",
 "Typewords": [
 "string"
 ]
 },
 {
 "Name": "MessageID",
-"Docs": "",
+"Docs": "From Message-Id header, includes \u003c\u003e.",
 "Typewords": [
 "string"
 ]
@@ -1918,7 +1918,7 @@
 },
 {
 "Name": "DSN",
-"Docs": "If this message is a DSN. For DSNs, we don't look at the subject when matching threads.",
+"Docs": "If this message is a DSN, generated by us or received. For DSNs, we don't look at the subject when matching threads.",
 "Typewords": [
 "bool"
 ]
@@ -99,8 +99,8 @@ export interface Envelope {
 To?: Address[] | null
 CC?: Address[] | null
 BCC?: Address[] | null
-InReplyTo: string
-MessageID: string
+InReplyTo: string // From In-Reply-To header, includes <>.
+MessageID: string // From Message-Id header, includes <>.
 }

 // Address as used in From and To headers.
@@ -302,7 +302,7 @@ export interface Message {
 ThreadMuted: boolean // If set, newly delivered child messages are automatically marked as read. This field is copied to new child messages. Changes are propagated to the webmail client.
 ThreadCollapsed: boolean // If set, this (sub)thread is collapsed in the webmail client, for threading mode "on" (mode "unread" ignores it). This field is copied to new child message. Changes are propagated to the webmail client.
 IsMailingList: boolean // If received message was known to match a mailing list rule (with modified junk filtering).
-DSN: boolean // If this message is a DSN. For DSNs, we don't look at the subject when matching threads.
+DSN: boolean // If this message is a DSN, generated by us or received. For DSNs, we don't look at the subject when matching threads.
 ReceivedTLSVersion: number // 0 if unknown, 1 if plaintext/no TLS, otherwise TLS cipher suite.
 ReceivedTLSCipherSuite: number
 ReceivedRequireTLS: boolean // Whether RequireTLS was known to be used for incoming delivery.
@@ -274,6 +274,8 @@ func TestAPI(t *testing.T) {

 // MessageSubmit
 queue.Localserve = true // Deliver directly to us instead attempting actual delivery.
+err = queue.Init()
+tcheck(t, err, "queue init")
 api.MessageSubmit(ctx, SubmitMessage{
 From: "mjl@mox.example",
 To: []string{"mjl+to@mox.example", "mjl to2 <mjl+to2@mox.example>"},
@@ -220,6 +220,7 @@ const [dom, style, attr, prop] = (function () {
 autocomplete: (s) => _attr('autocomplete', s),
 list: (s) => _attr('list', s),
 form: (s) => _attr('form', s),
+size: (s) => _attr('size', s),
 };
 const style = (x) => { return { _styles: x }; };
 const prop = (x) => { return { _props: x }; };
@@ -220,6 +220,7 @@ const [dom, style, attr, prop] = (function () {
 autocomplete: (s) => _attr('autocomplete', s),
 list: (s) => _attr('list', s),
 form: (s) => _attr('form', s),
+size: (s) => _attr('size', s),
 };
 const style = (x) => { return { _styles: x }; };
 const prop = (x) => { return { _props: x }; };
@@ -77,7 +77,7 @@ var webmailtextHTML []byte
 var webmailtextJS []byte

 var (
-// Similar between ../webmail/webmail.go:/metricSubmission and ../smtpserver/server.go:/metricSubmission
+// Similar between ../webmail/webmail.go:/metricSubmission and ../smtpserver/server.go:/metricSubmission and ../webapisrv/server.go:/metricSubmission
 metricSubmission = promauto.NewCounterVec(
 prometheus.CounterOpts{
 Name: "mox_webmail_submission_total",
@@ -220,6 +220,7 @@ const [dom, style, attr, prop] = (function () {
 autocomplete: (s) => _attr('autocomplete', s),
 list: (s) => _attr('list', s),
 form: (s) => _attr('form', s),
+size: (s) => _attr('size', s),
 };
 const style = (x) => { return { _styles: x }; };
 const prop = (x) => { return { _props: x }; };
webops/xops.go (new file, 480 lines)
@@ -0,0 +1,480 @@
// Package webops implements shared functionality between webapisrv and webmail.
package webops

import (
    "context"
    "errors"
    "fmt"
    "io"
    "os"
    "sort"

    "golang.org/x/exp/maps"

    "github.com/mjl-/bstore"

    "github.com/mjl-/mox/message"
    "github.com/mjl-/mox/mlog"
    "github.com/mjl-/mox/store"
)

var ErrMessageNotFound = errors.New("no such message")

type XOps struct {
    DBWrite    func(ctx context.Context, acc *store.Account, fn func(tx *bstore.Tx))
    Checkf     func(ctx context.Context, err error, format string, args ...any)
    Checkuserf func(ctx context.Context, err error, format string, args ...any)
}

func (x XOps) mailboxID(ctx context.Context, tx *bstore.Tx, mailboxID int64) store.Mailbox {
    if mailboxID == 0 {
        x.Checkuserf(ctx, errors.New("invalid zero mailbox ID"), "getting mailbox")
    }
    mb := store.Mailbox{ID: mailboxID}
    err := tx.Get(&mb)
    if err == bstore.ErrAbsent {
        x.Checkuserf(ctx, err, "getting mailbox")
    }
    x.Checkf(ctx, err, "getting mailbox")
    return mb
}

// messageID returns a non-expunged message or panics with a sherpa error.
func (x XOps) messageID(ctx context.Context, tx *bstore.Tx, messageID int64) store.Message {
    if messageID == 0 {
        x.Checkuserf(ctx, errors.New("invalid zero message id"), "getting message")
    }
    m := store.Message{ID: messageID}
    err := tx.Get(&m)
    if err == bstore.ErrAbsent {
        x.Checkuserf(ctx, ErrMessageNotFound, "getting message")
    } else if err == nil && m.Expunged {
        x.Checkuserf(ctx, errors.New("message was removed"), "getting message")
    }
    x.Checkf(ctx, err, "getting message")
    return m
}

func (x XOps) MessageDelete(ctx context.Context, log mlog.Log, acc *store.Account, messageIDs []int64) {
    acc.WithWLock(func() {
        removeChanges := map[int64]store.ChangeRemoveUIDs{}
        changes := make([]store.Change, 0, len(messageIDs)+1) // n remove, 1 mailbox counts

        x.DBWrite(ctx, acc, func(tx *bstore.Tx) {
            var modseq store.ModSeq
            var mb store.Mailbox
            remove := make([]store.Message, 0, len(messageIDs))

            var totalSize int64
            for _, mid := range messageIDs {
                m := x.messageID(ctx, tx, mid)
                totalSize += m.Size

                if m.MailboxID != mb.ID {
                    if mb.ID != 0 {
                        err := tx.Update(&mb)
                        x.Checkf(ctx, err, "updating mailbox counts")
                        changes = append(changes, mb.ChangeCounts())
                    }
                    mb = x.mailboxID(ctx, tx, m.MailboxID)
                }

                qmr := bstore.QueryTx[store.Recipient](tx)
                qmr.FilterEqual("MessageID", m.ID)
                _, err := qmr.Delete()
                x.Checkf(ctx, err, "removing message recipients")

                mb.Sub(m.MailboxCounts())

                if modseq == 0 {
                    modseq, err = acc.NextModSeq(tx)
                    x.Checkf(ctx, err, "assigning next modseq")
                }
                m.Expunged = true
                m.ModSeq = modseq
                err = tx.Update(&m)
                x.Checkf(ctx, err, "marking message as expunged")

                ch := removeChanges[m.MailboxID]
                ch.UIDs = append(ch.UIDs, m.UID)
                ch.MailboxID = m.MailboxID
                ch.ModSeq = modseq
                removeChanges[m.MailboxID] = ch
                remove = append(remove, m)
            }

            if mb.ID != 0 {
                err := tx.Update(&mb)
                x.Checkf(ctx, err, "updating count in mailbox")
                changes = append(changes, mb.ChangeCounts())
            }

            err := acc.AddMessageSize(log, tx, -totalSize)
            x.Checkf(ctx, err, "updating disk usage")

            // Mark removed messages as not needing training, then retrain them, so if they
            // were trained, they get untrained.
            for i := range remove {
                remove[i].Junk = false
                remove[i].Notjunk = false
            }
            err = acc.RetrainMessages(ctx, log, tx, remove, true)
            x.Checkf(ctx, err, "untraining deleted messages")
        })

        for _, ch := range removeChanges {
            sort.Slice(ch.UIDs, func(i, j int) bool {
                return ch.UIDs[i] < ch.UIDs[j]
            })
            changes = append(changes, ch)
        }
        store.BroadcastChanges(acc, changes)
    })

    for _, mID := range messageIDs {
        p := acc.MessagePath(mID)
        err := os.Remove(p)
        log.Check(err, "removing message file for expunge")
    }
}

func (x XOps) MessageFlagsAdd(ctx context.Context, log mlog.Log, acc *store.Account, messageIDs []int64, flaglist []string) {
    flags, keywords, err := store.ParseFlagsKeywords(flaglist)
    x.Checkuserf(ctx, err, "parsing flags")

    acc.WithRLock(func() {
        var changes []store.Change

        x.DBWrite(ctx, acc, func(tx *bstore.Tx) {
            var modseq store.ModSeq
            var retrain []store.Message
            var mb, origmb store.Mailbox

            for _, mid := range messageIDs {
                m := x.messageID(ctx, tx, mid)

                if mb.ID != m.MailboxID {
                    if mb.ID != 0 {
                        err := tx.Update(&mb)
                        x.Checkf(ctx, err, "updating mailbox")
                        if mb.MailboxCounts != origmb.MailboxCounts {
                            changes = append(changes, mb.ChangeCounts())
                        }
                        if mb.KeywordsChanged(origmb) {
                            changes = append(changes, mb.ChangeKeywords())
                        }
                    }
                    mb = x.mailboxID(ctx, tx, m.MailboxID)
                    origmb = mb
                }
                mb.Keywords, _ = store.MergeKeywords(mb.Keywords, keywords)

                mb.Sub(m.MailboxCounts())
                oflags := m.Flags
                m.Flags = m.Flags.Set(flags, flags)
                var kwChanged bool
                m.Keywords, kwChanged = store.MergeKeywords(m.Keywords, keywords)
                mb.Add(m.MailboxCounts())

                if m.Flags == oflags && !kwChanged {
                    continue
                }

                if modseq == 0 {
                    modseq, err = acc.NextModSeq(tx)
                    x.Checkf(ctx, err, "assigning next modseq")
                }
                m.ModSeq = modseq
                err = tx.Update(&m)
                x.Checkf(ctx, err, "updating message")

                changes = append(changes, m.ChangeFlags(oflags))
                retrain = append(retrain, m)
            }

            if mb.ID != 0 {
                err := tx.Update(&mb)
                x.Checkf(ctx, err, "updating mailbox")
                if mb.MailboxCounts != origmb.MailboxCounts {
                    changes = append(changes, mb.ChangeCounts())
                }
                if mb.KeywordsChanged(origmb) {
                    changes = append(changes, mb.ChangeKeywords())
                }
            }

            err = acc.RetrainMessages(ctx, log, tx, retrain, false)
            x.Checkf(ctx, err, "retraining messages")
        })

        store.BroadcastChanges(acc, changes)
    })
}

func (x XOps) MessageFlagsClear(ctx context.Context, log mlog.Log, acc *store.Account, messageIDs []int64, flaglist []string) {
    flags, keywords, err := store.ParseFlagsKeywords(flaglist)
    x.Checkuserf(ctx, err, "parsing flags")

    acc.WithRLock(func() {
        var retrain []store.Message
        var changes []store.Change

        x.DBWrite(ctx, acc, func(tx *bstore.Tx) {
            var modseq store.ModSeq
            var mb, origmb store.Mailbox

            for _, mid := range messageIDs {
                m := x.messageID(ctx, tx, mid)

                if mb.ID != m.MailboxID {
                    if mb.ID != 0 {
                        err := tx.Update(&mb)
                        x.Checkf(ctx, err, "updating counts for mailbox")
                        if mb.MailboxCounts != origmb.MailboxCounts {
                            changes = append(changes, mb.ChangeCounts())
                        }
                        // note: cannot remove keywords from mailbox by removing keywords from message.
                    }
                    mb = x.mailboxID(ctx, tx, m.MailboxID)
                    origmb = mb
                }

                oflags := m.Flags
                mb.Sub(m.MailboxCounts())
                m.Flags = m.Flags.Set(flags, store.Flags{})
                var changed bool
                m.Keywords, changed = store.RemoveKeywords(m.Keywords, keywords)
                mb.Add(m.MailboxCounts())

                if m.Flags == oflags && !changed {
                    continue
                }

                if modseq == 0 {
                    modseq, err = acc.NextModSeq(tx)
                    x.Checkf(ctx, err, "assigning next modseq")
                }
                m.ModSeq = modseq
                err = tx.Update(&m)
                x.Checkf(ctx, err, "updating message")

                changes = append(changes, m.ChangeFlags(oflags))
                retrain = append(retrain, m)
            }

            if mb.ID != 0 {
                err := tx.Update(&mb)
                x.Checkf(ctx, err, "updating keywords in mailbox")
                if mb.MailboxCounts != origmb.MailboxCounts {
                    changes = append(changes, mb.ChangeCounts())
                }
                // note: cannot remove keywords from mailbox by removing keywords from message.
            }

            err = acc.RetrainMessages(ctx, log, tx, retrain, false)
            x.Checkf(ctx, err, "retraining messages")
        })

        store.BroadcastChanges(acc, changes)
    })
}

// MessageMove moves messages to the mailbox represented by mailboxName, or to mailboxID if mailboxName is empty.
func (x XOps) MessageMove(ctx context.Context, log mlog.Log, acc *store.Account, messageIDs []int64, mailboxName string, mailboxID int64) {
    acc.WithRLock(func() {
        retrain := make([]store.Message, 0, len(messageIDs))
        removeChanges := map[int64]store.ChangeRemoveUIDs{}
        // n adds, 1 remove, 2 mailboxcounts, optimistic and at least for a single message.
        changes := make([]store.Change, 0, len(messageIDs)+3)

        x.DBWrite(ctx, acc, func(tx *bstore.Tx) {
            var mbSrc store.Mailbox
            var modseq store.ModSeq

            if mailboxName != "" {
                mb, err := acc.MailboxFind(tx, mailboxName)
                x.Checkf(ctx, err, "looking up mailbox name")
                if mb == nil {
                    x.Checkuserf(ctx, errors.New("not found"), "looking up mailbox name")
                } else {
                    mailboxID = mb.ID
                }
            }

            mbDst := x.mailboxID(ctx, tx, mailboxID)

            if len(messageIDs) == 0 {
                return
            }

            keywords := map[string]struct{}{}

            for _, mid := range messageIDs {
                m := x.messageID(ctx, tx, mid)

                // We may have loaded this mailbox in the previous iteration of this loop.
                if m.MailboxID != mbSrc.ID {
                    if mbSrc.ID != 0 {
                        err := tx.Update(&mbSrc)
                        x.Checkf(ctx, err, "updating source mailbox counts")
                        changes = append(changes, mbSrc.ChangeCounts())
                    }
                    mbSrc = x.mailboxID(ctx, tx, m.MailboxID)
                }

                if mbSrc.ID == mailboxID {
                    // Client should filter out messages that are already in mailbox.
                    x.Checkuserf(ctx, errors.New("already in destination mailbox"), "moving message")
                }

                var err error
                if modseq == 0 {
                    modseq, err = acc.NextModSeq(tx)
                    x.Checkf(ctx, err, "assigning next modseq")
                }

                ch := removeChanges[m.MailboxID]
                ch.UIDs = append(ch.UIDs, m.UID)
                ch.ModSeq = modseq
                ch.MailboxID = m.MailboxID
                removeChanges[m.MailboxID] = ch

                // Copy of message record that we'll insert when UID is freed up.
                om := m
                om.PrepareExpunge()
                om.ID = 0 // Assign new ID.
                om.ModSeq = modseq

                mbSrc.Sub(m.MailboxCounts())

                if mbDst.Trash {
                    m.Seen = true
                }
                conf, _ := acc.Conf()
                m.MailboxID = mbDst.ID
                if m.IsReject && m.MailboxDestinedID != 0 {
                    // Incorrectly delivered to Rejects mailbox. Adjust MailboxOrigID so this message
                    // is used for reputation calculation during future deliveries.
                    m.MailboxOrigID = m.MailboxDestinedID
                    m.IsReject = false
                    m.Seen = false
                }
                m.UID = mbDst.UIDNext
                m.ModSeq = modseq
                mbDst.UIDNext++
                m.JunkFlagsForMailbox(mbDst, conf)
                err = tx.Update(&m)
                x.Checkf(ctx, err, "updating moved message in database")

                // Now that UID is unused, we can insert the old record again.
                err = tx.Insert(&om)
                x.Checkf(ctx, err, "inserting record for expunge after moving message")

                mbDst.Add(m.MailboxCounts())

                changes = append(changes, m.ChangeAddUID())
                retrain = append(retrain, m)

                for _, kw := range m.Keywords {
                    keywords[kw] = struct{}{}
                }
            }

            err := tx.Update(&mbSrc)
            x.Checkf(ctx, err, "updating source mailbox counts")

            changes = append(changes, mbSrc.ChangeCounts(), mbDst.ChangeCounts())

            // Ensure destination mailbox has keywords of the moved messages.
            var mbKwChanged bool
            mbDst.Keywords, mbKwChanged = store.MergeKeywords(mbDst.Keywords, maps.Keys(keywords))
            if mbKwChanged {
                changes = append(changes, mbDst.ChangeKeywords())
            }

            err = tx.Update(&mbDst)
            x.Checkf(ctx, err, "updating mailbox with uidnext")

            err = acc.RetrainMessages(ctx, log, tx, retrain, false)
            x.Checkf(ctx, err, "retraining messages after move")
        })

        // Ensure UIDs of the removed message are in increasing order. It is quite common
        // for all messages to be from a single source mailbox, meaning this is just one
        // change, for which we preallocated space.
        for _, ch := range removeChanges {
            sort.Slice(ch.UIDs, func(i, j int) bool {
                return ch.UIDs[i] < ch.UIDs[j]
            })
            changes = append(changes, ch)
        }
        store.BroadcastChanges(acc, changes)
    })
}

func isText(p message.Part) bool {
    return p.MediaType == "" && p.MediaSubType == "" || p.MediaType == "TEXT" && p.MediaSubType == "PLAIN"
}

func isHTML(p message.Part) bool {
    return p.MediaType == "" && p.MediaSubType == "" || p.MediaType == "TEXT" && p.MediaSubType == "HTML"
}

func isAlternative(p message.Part) bool {
    return p.MediaType == "MULTIPART" && p.MediaSubType == "ALTERNATIVE"
}

func readPart(p message.Part, maxSize int64) (string, error) {
    buf, err := io.ReadAll(io.LimitReader(p.ReaderUTF8OrBinary(), maxSize))
    if err != nil {
        return "", fmt.Errorf("reading part contents: %v", err)
    }
    return string(buf), nil
}

// ReadableParts returns the contents of the first text and/or html parts,
// descending into multiparts, truncated to maxSize bytes if longer.
func ReadableParts(p message.Part, maxSize int64) (text string, html string, found bool, err error) {
    // todo: may want to merge this logic with webmail's message parsing.

    // For non-multipart messages, top-level part.
    if isText(p) {
        data, err := readPart(p, maxSize)
        return data, "", true, err
    } else if isHTML(p) {
        data, err := readPart(p, maxSize)
        return "", data, true, err
    }

    // Look in sub-parts. Stop when we have a readable part, don't continue with other
    // subparts unless we have a multipart/alternative.
    // todo: we may have to look at disposition "inline".
    var haveText, haveHTML bool
    for _, pp := range p.Parts {
        if isText(pp) {
            haveText = true
            text, err = readPart(pp, maxSize)
            if !isAlternative(p) {
                break
            }
        } else if isHTML(pp) {
            haveHTML = true
            html, err = readPart(pp, maxSize)
            if !isAlternative(p) {
                break
            }
        }
    }
    if haveText || haveHTML {
        return text, html, true, err
    }

    // Descend into the subparts.
    for _, pp := range p.Parts {
        text, html, found, err = ReadableParts(pp, maxSize)
        if found {
            break
        }
    }
    return
}
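Aside, not part of the patch: the new webops package above centralizes delete/flag/move operations behind three callbacks, which webmail fills in with its xdbwrite/xcheckf/xcheckuserf helpers (see the earlier hunks). The sketch below shows how another caller could wire it up; it assumes store.Account exposes its bstore database as acc.DB with a Write method of roughly this shape, and uses plain panics where webmail raises sherpa errors.

package example

import (
    "context"
    "fmt"

    "github.com/mjl-/bstore"

    "github.com/mjl-/mox/mlog"
    "github.com/mjl-/mox/store"
    "github.com/mjl-/mox/webops"
)

// markSeen marks the given messages as seen through the shared operations.
func markSeen(ctx context.Context, log mlog.Log, acc *store.Account, msgIDs []int64) {
    ops := webops.XOps{
        // Run fn inside a write transaction on the account database.
        DBWrite: func(ctx context.Context, acc *store.Account, fn func(tx *bstore.Tx)) {
            err := acc.DB.Write(ctx, func(tx *bstore.Tx) error {
                fn(tx)
                return nil
            })
            if err != nil {
                panic("db write: " + err.Error())
            }
        },
        // Server-side and user-caused error checks; stand-ins for xcheckf/xcheckuserf.
        Checkf: func(ctx context.Context, err error, format string, args ...any) {
            if err != nil {
                panic(fmt.Sprintf(format, args...) + ": " + err.Error())
            }
        },
        Checkuserf: func(ctx context.Context, err error, format string, args ...any) {
            if err != nil {
                panic(fmt.Sprintf(format, args...) + ": " + err.Error())
            }
        },
    }
    ops.MessageFlagsAdd(ctx, log, acc, msgIDs, []string{`\seen`})
}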
@@ -307,6 +307,35 @@ and can enable REQUIRETLS by default.
 See [webmail screenshots](../screenshots/#hdr-webmail).

+
+## Webapi and webhooks
+
+The webapi and webhooks make it easy to send/receive transactional email with
+only HTTP/JSON, not requiring detailed knowledge of and/or libraries for
+composing email messages (internet message format, IMF), SMTP for submission,
+and IMAP for handling incoming messages including delivery status notifications
+(DSNs).
+
+Outgoing webhooks notify about events for outgoing deliveries (such as
+"delivered", "delayed", "failed", "suppressed").
+
+Incoming webhooks notify about incoming deliveries.
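As an illustration of the webhook side (not part of the patch or the documentation being added): a receiving endpoint is just an HTTP handler that accepts the JSON event mox POSTs to it. The exact payload fields are documented at https://pkg.go.dev/github.com/mjl-/mox/webapi/ and are not spelled out in this diff, so this sketch decodes generic JSON.

package main

import (
    "encoding/json"
    "log"
    "net/http"
)

func main() {
    http.HandleFunc("/mox-hooks/outgoing", func(w http.ResponseWriter, r *http.Request) {
        var event map[string]any
        if err := json.NewDecoder(r.Body).Decode(&event); err != nil {
            http.Error(w, "bad json", http.StatusBadRequest)
            return
        }
        log.Printf("outgoing delivery event: %v", event)
        // Respond with a success status once processed; failed webhook calls are
        // retried by mox, similar to message deliveries.
        w.WriteHeader(http.StatusOK)
    })
    log.Fatal(http.ListenAndServe("127.0.0.1:8080", nil))
}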
+
+The webapi can be used to submit messages to the queue, and to process incoming
+messages, for example by moving them to another mailbox, setting/clearing flags
+or deleting them.
+
+Per-account suppression lists, automatically managed based on SMTP status codes
+and DSN messages, protect the reputation of your mail server.
+
+For API documentation and examples of the webapi and webhooks, see
+https://pkg.go.dev/github.com/mjl-/mox/webapi/. Earlier mox versions can be
+selected in the top left (at the time of writing).
+
+The mox webapi endpoint at /webapi/v0/ lists available methods and links to
+them, each method page showing an example request and response JSON object and
+lets you call the method.
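A sketch of calling a webapi method directly over HTTP (not part of the patch). The method name "Send", HTTP basic auth with an account login address, the "request" form field carrying the JSON request, and the JSON field names are all assumptions here; the /webapi/v0/ method pages and the pkg.go.dev documentation for your mox version show the exact request encoding.

package main

import (
    "fmt"
    "io"
    "net/http"
    "net/url"
    "strings"
)

func main() {
    reqJSON := `{"To": [{"Address": "recipient@example.org"}], "Subject": "hello", "Text": "hi"}`
    form := url.Values{"request": {reqJSON}}
    req, err := http.NewRequest("POST", "https://mail.example.org/webapi/v0/Send", strings.NewReader(form.Encode()))
    if err != nil {
        panic(err)
    }
    req.SetBasicAuth("you@example.org", "password")
    req.Header.Set("Content-Type", "application/x-www-form-urlencoded")
    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    body, _ := io.ReadAll(resp.Body)
    fmt.Println(resp.Status, string(body))
}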
+

 ## Internationalized email

 Originally, email addresses were ASCII-only. An email address consists of a
@@ -481,6 +481,7 @@ h2 { background: linear-gradient(90deg, #6dd5fd 0%, #77e8e3 100%); display: inli
 External links:
 <ul style="list-style: none">
 <li><a href="https://github.com/mjl-/mox">Sources at github</a></li>
+<li><a href="https://pkg.go.dev/github.com/mjl-/mox/webapi/">Webapi & webhooks</a></li>
 </ul>
 </div>
 </nav>