Mirror of https://github.com/Kozea/Radicale.git, synced 2025-04-02 20:57:37 +03:00

Synced with origin

commit cf81d1f9a7 (parent 9b3bb2de2b)
94 changed files with 5096 additions and 3560 deletions

541	CHANGELOG.md	(new file)
@@ -0,0 +1,541 @@

# Changelog

## master

## 3.1.8

* Fix setuptools requirement if installing wheel
* Tests: Switch from `python setup.py test` to `tox`
* Small changes to build system configuration and tests

## 3.1.7

* Fix random href fallback

## 3.1.6

* Ignore `Not a directory` error for optional config paths
* Fix upload of whole address book/calendar with UIDs that collide on
  case-insensitive filesystems
* Remove runtime dependency on setuptools for Python >= 3.9
* Windows: Block ADS paths

## 3.1.5

* Ignore configuration file if access is denied
* Use F_FULLFSYNC with PyPy on MacOS
* Fallback if F_FULLFSYNC is not supported by the filesystem

## 3.1.4

* Fallback if RENAME_EXCHANGE is not supported by the filesystem
* Assume POSIX compatibility if `sys.platform` is not `win32`

## 3.1.3

* Redirect '…/.well-known/caldav' and '…/.well-known/carddav' to base prefix
* Warning instead of error when base prefix ends with '/'

## 3.1.2

* Verify that base prefix starts with '/' but doesn't end with '/'
* Improve base prefix log message
* Never send body for HEAD requests (again)

## 3.1.1

* Workaround for contact photo bug in InfCloud
* Redirect GET and HEAD requests under `/.web` to sanitized path
* Set `Content-Length` header for HEAD requests
* Never send body for HEAD requests
* Improve error messages for `from_file` rights backend
* Don't sanitize WSGI script name

## 3.1.0

* Single `<D:propstat>` element in PROPPATCH response
* Allow multiple `<D:set>` and `<D:remove>` elements
* Improve log messages
* Fix date filter
* Improve sanitization of collection properties
* Cancel MKCALENDAR request on error
* Use **renameat2** on Linux for atomic overwriting of collections
* Command line parser
  * Disallow abbreviated arguments
  * Support backend-specific options and HTTP headers
  * Optional argument for boolean options
  * Load no config file for `--config` without argument
* Allow float for server->timeout setting
* Fix **is-not-defined** filter in **addressbook-query** report
* Add Python type hints
* Add **multifilesystem_nolock** storage
* Add support for Python 3.9 and 3.10
* Drop support for Python 3.5
* Fix compatibility with Evolution (exceptions from recurrence rules)

## 3.0.6

* Allow web plugins to handle POST requests

## 3.0.5

* Start storage hook in own process group
* Kill storage hook on error or exit
* Try to kill child processes of storage hook
* Internal server: Exit immediately when signal is received
  (do not wait for clients or storage hook to finish)

## 3.0.4

* Fix internal server on FreeBSD

## 3.0.3

* Fix internal server on OpenBSD

## 3.0.2

* Use 403 response for supported-report and valid-sync-token errors
* Internal server: Handle missing IPv6 support

## 3.0.1

* Fix XML error messages

## 3.0.0

This release is incompatible with previous releases.
See the upgrade checklist below.

* Parallel write requests
* Support PyPy
* Protect against XML denial-of-service attacks
* Check for duplicated UIDs in calendars/address books
* Only add missing UIDs for uploaded whole calendars/address books
* Switch from md5 to sha256 for UIDs and tokens
* Code cleanup:
  * All plugin interfaces were simplified and are incompatible with
    old plugins
  * Major refactor
  * Never sanitize paths multiple times (check if they are sanitized)
* Config
  * Multiple configuration files separated by `:` (resp. `;` on Windows)
  * Optional configuration files by prepending the file path with `?`
  * Check the validity of every configuration file and of command line
    arguments separately
  * Report the source of invalid configuration parameters in
    error messages
  * Code cleanup:
    * Store configuration as parsed values
    * Use a schema that describes the configuration and allow plugins to
      apply their own schemas
    * Mark internal settings with `_`
* Internal server
  * Bind to IPv4 and IPv6 addresses when both are available for a hostname
  * Set default address to `localhost:5232`
  * Remove settings for SSL ciphers and protocol versions (enforce safe
    defaults instead)
  * Remove settings for file locking because they are of little use
  * Remove daemonization (should be handled by service managers)
* Logging
  * Replace complex Python logger configuration with a simple
    `logging.level` setting
  * Write PID and `threadName` instead of cryptic IDs in log messages
  * Use `wsgi.errors` for logging (as required by the WSGI spec)
  * Code cleanup:
    * Don't pass a logger object around (use `logging.getLogger()`
      instead)
* Auth
  * Use `md5` as default for the `htpasswd_encryption` setting
  * Move the `realm` setting from section `server` to `auth`
* Rights
  * Use permissions `RW` for non-leaf collections and `rw` for
    address books/calendars
  * New permission `i` that only allows access with the HTTP method GET
    (CalDAV/CardDAV is susceptible to expensive search requests)
* Web
  * Add upload dialog for calendars/address books from file
  * Show startup loading message
  * Show warning if JavaScript is disabled
  * Pass HTML validator
* Storage
  * Check for missing UIDs in items
  * Check for child collections in address books and calendars
  * Code cleanup:
    * Split BaseCollection into BaseStorage and BaseCollection
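The multiple-configuration-files behaviour introduced in 3.0.0 (files separated by `:`, resp. `;` on Windows, with `?` marking a file optional) can be sketched as follows. The function name and the returned tuple format are made up for illustration; this is not Radicale's actual implementation:

```python
import os

def split_config_paths(value, sep=None):
    """Split a list of configuration file paths separated by ':' (resp.
    ';' on Windows); a leading '?' marks a file as optional.  A sketch of
    the behaviour described above, not Radicale's real parsing code."""
    if sep is None:
        sep = ";" if os.name == "nt" else ":"
    paths = []
    for raw in value.split(sep):
        optional = raw.startswith("?")
        paths.append((raw[1:] if optional else raw, optional))
    return paths

# The first file may be absent without an error; the second is required.
print(split_config_paths("?/etc/radicale/config:/home/user/.config/radicale/config", sep=":"))
```

A server would then skip missing optional files instead of aborting, while a missing required file stays a hard error.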

## Upgrade checklist

* Config
  * Some settings were removed
  * The default of `auth.htpasswd_encryption` changed to `md5`
  * The setting `server.realm` moved to `auth.realm`
  * The setting `logging.debug` was replaced by `logging.level`
  * The format of the `rights.file` configuration file changed:
    * Permission `r` replaced by `Rr`
    * Permission `w` replaced by `Ww`
    * New permission `i` added as a subset of `r`
    * Variable `%(login)s` replaced by `{user}`
    * Variable `%(path)s` removed
    * `{` must be escaped as `{{` and `}` as `}}` in regexes
* File system storage
  * The storage format is compatible with Radicale 2.x.x
  * Run `radicale --verify-storage` to check for errors
* Custom plugins:
  * `auth` and `web` plugins require minor adjustments
  * `rights` plugins must be adapted to the new permission model
  * `storage` plugins require major changes
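As an illustration of the `rights.file` changes in the checklist above, a hypothetical section granting each user read/write access to their own collections might look like this (the section name and regexes are made up for the example; the `{user}` placeholder and the `Rr`/`Ww` permissions come from the checklist):

```ini
# Hypothetical rights section under the Radicale 3 permission model.
# "{user}" expands to the logged-in user; literal braces in regexes
# must be doubled ("{{" and "}}").
[owner-collections]
user: .+
collection: {user}(/.*)?
permissions: RrWw
```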

## 2.1.10 - Wild Radish

This release is compatible with version 2.0.0.

* Update required versions for dependencies
* Get `RADICALE_CONFIG` from WSGI environ
* Improve HTTP status codes
* Fix race condition in storage lock creation
* Raise default limits for content length and timeout
* Log output from hook

## 2.1.9 - Wild Radish

This release is compatible with version 2.0.0.

* Specify versions for dependencies
* Move WSGI initialization into module
* Check if `REPORT` method is actually supported
* Include `rights` file in source distribution
* Specify `md5` and `bcrypt` as extras
* Improve logging messages
* Windows: Fix crash when item path is a directory

## 2.1.8 - Wild Radish

This release is compatible with version 2.0.0.

* Flush files before fsync'ing

## 2.1.7 - Wild Radish

This release is compatible with version 2.0.0.

* Don't print warning when cache format changes
* Add documentation for `BaseAuth`
* Add `is_authenticated2(login, user, password)` to `BaseAuth`
* Fix names of custom properties in PROPFIND requests with
  `D:propname` or `D:allprop`
* Return all properties in PROPFIND requests with `D:propname` or
  `D:allprop`
* Allow `D:displayname` property on all collections
* Answer with `D:unauthenticated` for the `D:current-user-principal`
  property when not logged in
* Remove non-existing `ICAL:calendar-color` and `C:calendar-timezone`
  properties from PROPFIND requests with `D:propname` or `D:allprop`
* Add `D:owner` property to calendar and address book objects
* Remove `D:getetag` and `D:getlastmodified` properties from regular
  collections

## 2.1.6 - Wild Radish

This release is compatible with version 2.0.0.

* Fix content type of VLIST
* Specify correct COMPONENT in content type of VCALENDAR
* Cache COMPONENT of calendar objects (improves speed with some clients)
* Stricter parsing of filters
* Improve support for CardDAV filters
* Fix some smaller bugs in CalDAV filters
* Add X-WR-CALNAME and X-WR-CALDESC to calendars downloaded via HTTP/WebDAV
* Use X-WR-CALNAME and X-WR-CALDESC from calendars published via WebDAV

## 2.1.5 - Wild Radish

This release is compatible with version 2.0.0.

* Add `--verify-storage` command-line argument
* Allow comments in the htpasswd file
* Don't strip whitespace from user names and passwords in the htpasswd file
* Remove cookies from logging output
* Allow uploads of whole collections with many components
* Show warning message if server.timeout is used with Python < 3.5.2

## 2.1.4 - Wild Radish

This release is compatible with version 2.0.0.

* Fix incorrect time range matching and calculation for some edge cases
  with rescheduled recurrences
* Fix owner property

## 2.1.3 - Wild Radish

This release is compatible with version 2.0.0.

* Enable timeout for SSL handshakes and move them out of the main thread
* Create cache entries during upload of items
* Stop built-in server on Windows when Ctrl+C is pressed
* Prevent slowdown when multiple requests hit a collection during cache
  warm-up

## 2.1.2 - Wild Radish

This release is compatible with version 2.0.0.

* Remove workarounds for bugs in VObject < 0.9.5
* Error checking of collection tags and associated components
* Improve error checking of uploaded collections and components
* Don't delete empty collection properties implicitly
* Improve logging of VObject serialization

## 2.1.1 - Wild Radish Again

This release is compatible with version 2.0.0.

* Add missing UIDs instead of failing
* Improve error checking of calendar and address book objects
* Fix upload of whole address books

## 2.1.0 - Wild Radish

This release is compatible with version 2.0.0.

* Built-in web interface for creating and managing address books and
  calendars
  * Can be extended with web plugins
* Much faster storage backend
* Significant reduction in memory usage
* Improved logging
  * Include paths (of invalid items/requests) in log messages
  * Include configuration values causing problems in log messages
  * Log warning message for invalid requests by clients
  * Log error message for invalid files in the storage backend
  * No stack traces unless debugging is enabled
* Time range filter also regards overwritten recurrences
* Items that couldn't be filtered because of bugs in VObject are always
  returned (and a warning message is logged)
* Basic error checking of configuration files
* File system locking isn't disabled implicitly anymore; instead, a new
  configuration option was introduced
* The permissions of the lock file are not changed anymore
* Support for sync-token
* Support for client-side SSL certificates
* Rights plugins can decide if access to an item is granted explicitly
* Respond with 403 instead of 404 for principal collections of
  non-existing users when the `owner_only` plugin is used (information
  leakage)
* Authentication plugins can provide the login and password from the
  environment
* New `remote_user` plugin, which gets the login from the `REMOTE_USER`
  environment variable (for WSGI servers)
* New `http_x_remote_user` plugin, which gets the login from the
  `X-Remote-User` HTTP header (for reverse proxies)

## 2.0.0 - Little Big Radish

This release is not compatible with the 1.x.x versions. Follow our
[migration guide](https://radicale.org/2.1.html#documentation/migration-from-1xx-to-2xx)
if you want to switch from 1.x.x to 2.0.0.

* Support Python 3.3+ only; Python 2 is not supported anymore
* Keep only one simple filesystem-based storage system
* Remove built-in Git support
* Remove built-in authentication modules
* Keep the WSGI interface, use Python HTTP server by default
* Use a real iCal parser, rely on the external "vobject" module
* Add solid calendar discovery
* Respect the difference between "files" and "folders", don't rely on
  slashes
* Remove calendar creation with GET requests
* Be stateless
* Use a file locker
* Add threading
* Get atomic writes
* Support new filters
* Support read-only permissions
* Allow external plugins for authentication, rights management, storage
  and version control

## 1.1.4 - Fifth Law of Nature

* Use `shutil.move` for `--export-storage`

## 1.1.3 - Fourth Law of Nature

* Add a `--export-storage=FOLDER` command-line argument (by Unrud, see #606)

## 1.1.2 - Third Law of Nature

* **Security fix**: Add a random timer to avoid timing oracles and simple
  brute-force attacks when using the htpasswd authentication method
* Various minor fixes

## 1.1.1 - Second Law of Nature

* Fix the owner_write rights rule

## 1.1 - Law of Nature

One feature in this release is **not backward compatible**:

* Use the first matching section for rights (inspired by daald)

Now the first section matching the path and the current user in your custom
rights file is used. In previous versions, the most permissive rights of all
matching sections were applied. This new behaviour gives a simple way to make
specific rules at the top of the file independent from the generic ones.
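The first-match behaviour described in the paragraph above can be sketched in a few lines. The function and section layout are illustrative, not Radicale's actual rights API:

```python
import re

def first_matching_permissions(sections, user, path):
    """Return the permissions of the first section whose user and path
    regexes both match; earlier sections shadow later ones.  A sketch of
    the 1.1 behaviour described above, not Radicale's real code."""
    for user_re, path_re, permissions in sections:
        if re.fullmatch(user_re, user) and re.fullmatch(path_re, path):
            return permissions  # first match wins; later sections ignored
    return ""  # no section matched: no access

sections = [
    ("alice", "alice(/.*)?", "rw"),  # specific rule at the top of the file
    (".+", ".*", "r"),               # generic fallback for everyone
]
print(first_matching_permissions(sections, "alice", "alice/calendar"))  # rw
print(first_matching_permissions(sections, "bob", "alice/calendar"))    # r
```

Under the old behaviour the two matching sections would have been merged to the most permissive result; with first-match, the specific `alice` rule fully decides her access.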

Many **improvements in this release are related to security**; you should
upgrade Radicale as soon as possible:

* Improve the regex used for well-known URIs (by Unrud)
* Prevent regex injection in rights management (by Unrud)
* Prevent crafted HTTP requests from calling arbitrary functions (by Unrud)
* Improve URI sanitization and conversion to filesystem path (by Unrud)
* Decouple the daemon from its parent environment (by Unrud)

Some bugs have been fixed and little enhancements have been added:

* Assign new items to the correct key (by Unrud)
* Avoid race condition in PID file creation (by Unrud)
* Improve the Docker version (by cdpb)
* Encode message and committer for git commits
* Test with Python 3.5

## 1.0.1 - Sunflower Again

* Update the version because of a **stupid** "feature"™ of PyPI

## 1.0 - Sunflower

* Enhanced performance (by Mathieu Dupuy)
* Add MD5-APR1 and BCRYPT for htpasswd-based authentication (by Jan-Philip
  Gehrcke)
* Use PAM service (by Stephen Paul Weber)
* Don't discard PROPPATCH on empty collections (by Markus Unterwaditzer)
* Write the path of the collection in the git message (by Matthew Monaco)
* Tests launched on Travis

## 0.10 - Lovely Endless Grass

* Support well-known URLs (by Mathieu Dupuy)
* Fix collection discovery (by Markus Unterwaditzer)
* Reload logger config on SIGHUP (by Élie Bouttier)
* Remove props files when deleting a collection (by Vincent Untz)
* Support salted SHA1 passwords (by Marc Kleine-Budde)
* Don't spam the logs about non-SSL IMAP connections to localhost (by Giel
  van Schijndel)

## 0.9 - Rivers

* Custom handlers for auth, storage and rights (by Sergey Fursov)
* 1-file-per-event storage (by Jean-Marc Martins)
* Git support for filesystem storages (by Jean-Marc Martins)
* DB storage working with PostgreSQL, MariaDB and SQLite (by Jean-Marc
  Martins)
* Clean rights manager based on regular expressions (by Sweil)
* Support of contacts for Apple's clients
* Support colors (by Jochen Sprickerhof)
* Decode URLs in XML (by Jean-Marc Martins)
* Fix PAM authentication (by Stepan Henek)
* Use consistent etags (by 9m66p93w)
* Use consistent sorting order (by Daniel Danner)
* Return 401 on unauthorized DELETE requests (by Eduard Braun)
* Move PID file creation into child process (by Mathieu Dupuy)
* Allow requests without base_prefix (by jheidemann)

## 0.8 - Rainbow

* New authentication and rights management modules (by Matthias Jordan)
* Experimental database storage
* Command-line option for custom configuration file (by Mark Adams)
* Root URL not at the root of a domain (by Clint Adams, Fabrice Bellet,
  Vincent Untz)
* Improved support for iCal, CalDAVSync, CardDAVSync, CalDavZAP and
  CardDavMATE
* Empty PROPFIND requests handled (by Christoph Polcin)
* Colon allowed in passwords
* Configurable realm message

## 0.7.1 - Waterfalls

* Many address book fixes
* New IMAP ACL (by Daniel Aleksandersen)
* PAM ACL fixed (by Daniel Aleksandersen)
* Courier ACL fixed (by Benjamin Frank)
* Always set display name to collections (by Oskari Timperi)
* Various DELETE responses fixed

## 0.7 - Eternal Sunshine

* Repeating events
* Collection deletion
* Courier and PAM authentication methods
* CardDAV support
* Custom LDAP filters supported

## 0.6.4 - Tulips

* Fix the installation with Python 3.1

## 0.6.3 - Red Roses

* MOVE requests fixed
* Faster REPORT answers
* Executable script moved into the package

## 0.6.2 - Seeds

* iPhone and iPad support fixed
* Backslashes replaced by slashes in PROPFIND answers on Windows
* PyPI archive set as default download URL

## 0.6.1 - Growing Up

* Example files included in the tarball
* htpasswd support fixed
* Redirection loop bug fixed
* Testing message on GET requests

## 0.6 - Sapling

* WSGI support
* IPv6 support
* Smart, verbose and configurable logs
* Apple iCal 4 and iPhone support (by Łukasz Langa)
* KDE KOrganizer support
* LDAP auth backend (by Corentin Le Bail)
* Public and private calendars (by René Neumann)
* PID file
* MOVE requests management
* Journal entries support
* Drop Python 2.5 support

## 0.5 - Historical Artifacts

* Calendar depth
* MacOS and Windows support
* HEAD requests management
* htpasswd user from calendar path

## 0.4 - Hot Days Back

* Personal calendars
* Last-Modified HTTP header
* `no-ssl` and `foreground` options
* Default configuration file

## 0.3 - Dancing Flowers

* Evolution support
* Version management

## 0.2 - Snowflakes

* Sunbird pre-1.0 support
* SSL connection
* htpasswd authentication
* Daemon mode
* User configuration
* Twisted dependency removed
* Python 3 support
* Real URLs for PUT and DELETE
* Concurrent modification reported to users
* Many bugs fixed (by Roger Wenham)

## 0.1 - Crazy Vegetables

* First release
* Lightning/Sunbird 0.9 compatibility
* Easy installer
674	COPYING	(file deleted)

@@ -1,674 +0,0 @@
GNU GENERAL PUBLIC LICENSE
|
||||
Version 3, 29 June 2007
|
||||
|
||||
Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
|
||||
Everyone is permitted to copy and distribute verbatim copies
|
||||
of this license document, but changing it is not allowed.
|
||||
|
||||
Preamble
|
||||
|
||||
The GNU General Public License is a free, copyleft license for
|
||||
software and other kinds of works.
|
||||
|
||||
The licenses for most software and other practical works are designed
|
||||
to take away your freedom to share and change the works. By contrast,
|
||||
the GNU General Public License is intended to guarantee your freedom to
|
||||
share and change all versions of a program--to make sure it remains free
|
||||
software for all its users. We, the Free Software Foundation, use the
|
||||
GNU General Public License for most of our software; it applies also to
|
||||
any other work released this way by its authors. You can apply it to
|
||||
your programs, too.
|
||||
|
||||
When we speak of free software, we are referring to freedom, not
|
||||
price. Our General Public Licenses are designed to make sure that you
|
||||
have the freedom to distribute copies of free software (and charge for
|
||||
them if you wish), that you receive source code or can get it if you
|
||||
want it, that you can change the software or use pieces of it in new
|
||||
free programs, and that you know you can do these things.
|
||||
|
||||
To protect your rights, we need to prevent others from denying you
|
||||
these rights or asking you to surrender the rights. Therefore, you have
|
||||
certain responsibilities if you distribute copies of the software, or if
|
||||
you modify it: responsibilities to respect the freedom of others.
|
||||
|
||||
For example, if you distribute copies of such a program, whether
|
||||
gratis or for a fee, you must pass on to the recipients the same
|
||||
freedoms that you received. You must make sure that they, too, receive
|
||||
or can get the source code. And you must show them these terms so they
|
||||
know their rights.
|
||||
|
||||
Developers that use the GNU GPL protect your rights with two steps:
|
||||
(1) assert copyright on the software, and (2) offer you this License
|
||||
giving you legal permission to copy, distribute and/or modify it.
|
||||
|
||||
For the developers' and authors' protection, the GPL clearly explains
|
||||
that there is no warranty for this free software. For both users' and
|
||||
authors' sake, the GPL requires that modified versions be marked as
|
||||
changed, so that their problems will not be attributed erroneously to
|
||||
authors of previous versions.
|
||||
|
||||
Some devices are designed to deny users access to install or run
|
||||
modified versions of the software inside them, although the manufacturer
|
||||
can do so. This is fundamentally incompatible with the aim of
|
||||
protecting users' freedom to change the software. The systematic
|
||||
pattern of such abuse occurs in the area of products for individuals to
|
||||
use, which is precisely where it is most unacceptable. Therefore, we
|
||||
have designed this version of the GPL to prohibit the practice for those
|
||||
products. If such problems arise substantially in other domains, we
|
||||
stand ready to extend this provision to those domains in future versions
|
||||
of the GPL, as needed to protect the freedom of users.
|
||||
|
||||
Finally, every program is threatened constantly by software patents.
|
||||
States should not allow patents to restrict development and use of
|
||||
software on general-purpose computers, but in those that do, we wish to
|
||||
avoid the special danger that patents applied to a free program could
|
||||
make it effectively proprietary. To prevent this, the GPL assures that
|
||||
patents cannot be used to render the program non-free.
|
||||
|
||||
The precise terms and conditions for copying, distribution and
|
||||
modification follow.
|
||||
|
||||
TERMS AND CONDITIONS
|
||||
|
||||
0. Definitions.
|
||||
|
||||
"This License" refers to version 3 of the GNU General Public License.
|
||||
|
||||
"Copyright" also means copyright-like laws that apply to other kinds of
|
||||
works, such as semiconductor masks.
|
||||
|
||||
"The Program" refers to any copyrightable work licensed under this
|
||||
License. Each licensee is addressed as "you". "Licensees" and
|
||||
"recipients" may be individuals or organizations.
|
||||
|
||||
To "modify" a work means to copy from or adapt all or part of the work
|
||||
in a fashion requiring copyright permission, other than the making of an
|
||||
exact copy. The resulting work is called a "modified version" of the
|
||||
earlier work or a work "based on" the earlier work.
|
||||
|
||||
A "covered work" means either the unmodified Program or a work based
|
||||
on the Program.
|
||||
|
||||
To "propagate" a work means to do anything with it that, without
|
||||
permission, would make you directly or secondarily liable for
|
||||
infringement under applicable copyright law, except executing it on a
|
||||
computer or modifying a private copy. Propagation includes copying,
|
||||
distribution (with or without modification), making available to the
|
||||
public, and in some countries other activities as well.
|
||||
|
||||
To "convey" a work means any kind of propagation that enables other
|
||||
parties to make or receive copies. Mere interaction with a user through
|
||||
a computer network, with no transfer of a copy, is not conveying.
|
||||
|
||||
An interactive user interface displays "Appropriate Legal Notices"
|
||||
to the extent that it includes a convenient and prominently visible
|
||||
feature that (1) displays an appropriate copyright notice, and (2)
|
||||
tells the user that there is no warranty for the work (except to the
|
||||
extent that warranties are provided), that licensees may convey the
|
||||
work under this License, and how to view a copy of this License. If
|
||||
the interface presents a list of user commands or options, such as a
|
||||
menu, a prominent item in the list meets this criterion.
|
||||
|
||||
1. Source Code.
|
||||
|
||||
The "source code" for a work means the preferred form of the work
|
||||
for making modifications to it. "Object code" means any non-source
|
||||
form of a work.
|
||||
|
||||
A "Standard Interface" means an interface that either is an official
|
||||
standard defined by a recognized standards body, or, in the case of
|
||||
interfaces specified for a particular programming language, one that
|
||||
is widely used among developers working in that language.
|
||||
|
||||
The "System Libraries" of an executable work include anything, other
|
||||
than the work as a whole, that (a) is included in the normal form of
|
||||
packaging a Major Component, but which is not part of that Major
|
||||
Component, and (b) serves only to enable use of the work with that
|
||||
Major Component, or to implement a Standard Interface for which an
|
||||
implementation is available to the public in source code form. A
|
||||
"Major Component", in this context, means a major essential component
|
||||
(kernel, window system, and so on) of the specific operating system
|
||||
(if any) on which the executable work runs, or a compiler used to
|
||||
produce the work, or an object code interpreter used to run it.
|
||||
|
||||
The "Corresponding Source" for a work in object code form means all
|
||||
the source code needed to generate, install, and (for an executable
|
||||
work) run the object code and to modify the work, including scripts to
|
||||
control those activities. However, it does not include the work's
|
||||
System Libraries, or general-purpose tools or generally available free
|
||||
programs which are used unmodified in performing those activities but
|
||||
which are not part of the work. For example, Corresponding Source
|
||||
includes interface definition files associated with source files for
|
||||
the work, and the source code for shared libraries and dynamically
|
||||
linked subprograms that the work is specifically designed to require,
|
||||
such as by intimate data communication or control flow between those
|
||||
subprograms and other parts of the work.
|
||||
|
||||
The Corresponding Source need not include anything that users
|
||||
can regenerate automatically from other parts of the Corresponding
|
||||
Source.
|
||||
|
||||
The Corresponding Source for a work in source code form is that
|
||||
same work.
|
||||
|
||||
2. Basic Permissions.
|
||||
|
||||
All rights granted under this License are granted for the term of
|
||||
copyright on the Program, and are irrevocable provided the stated
|
||||
conditions are met. This License explicitly affirms your unlimited
|
||||
permission to run the unmodified Program. The output from running a
|
||||
covered work is covered by this License only if the output, given its
|
||||
content, constitutes a covered work. This License acknowledges your
|
||||
rights of fair use or other equivalent, as provided by copyright law.
|
||||
|
||||
You may make, run and propagate covered works that you do not
|
||||
convey, without conditions so long as your license otherwise remains
|
||||
in force. You may convey covered works to others for the sole purpose
|
||||
of having them make modifications exclusively for you, or provide you
|
||||
with facilities for running those works, provided that you comply with
|
||||
the terms of this License in conveying all material for which you do
|
||||
not control copyright. Those thus making or running the covered works
|
||||
for you must do so exclusively on your behalf, under your direction
|
||||
and control, on terms that prohibit them from making any copies of
|
||||
your copyrighted material outside their relationship with you.
|
||||
|
||||
Conveying under any other circumstances is permitted solely under
|
||||
the conditions stated below. Sublicensing is not allowed; section 10
|
||||
makes it unnecessary.
|
||||
|
||||
3. Protecting Users' Legal Rights From Anti-Circumvention Law.

No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.

4. Conveying Verbatim Copies.

You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

5. Conveying Modified Source Versions.

You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:

a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.

b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".

c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.

d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.

A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

6. Conveying Non-Source Forms.

You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:

a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.

b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.

c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.

d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.

e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.

A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.

"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.

If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.

Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

7. Additional Terms.

"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:

a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or

b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or

c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or

d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or

e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or

f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.

All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.

8. Termination.

You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.

Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

9. Acceptance Not Required for Having Copies.

You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

10. Automatic Licensing of Downstream Recipients.

Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

11. Patents.

A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.

Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

12. No Surrender of Others' Freedom.

If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.

13. Use with the GNU Affero General Public License.

Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.

14. Revised Versions of this License.

The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.

If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.

Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.

15. Disclaimer of Warranty.

THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

16. Limitation of Liability.

IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.

17. Interpretation of Sections 15 and 16.

If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.

END OF TERMS AND CONDITIONS

How to Apply These Terms to Your New Programs

If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

    <one line to give the program's name and a brief idea of what it does.>
    Copyright (C) <year> <name of author>

    This program is free software: you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation, either version 3 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License
    along with this program.  If not, see <http://www.gnu.org/licenses/>.

Also add information on how to contact you by electronic and paper mail.

If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:

    <program> Copyright (C) <year> <name of author>
    This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
    This is free software, and you are welcome to redistribute it
    under certain conditions; type `show c' for details.

The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".

You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<http://www.gnu.org/licenses/>.

The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<http://www.gnu.org/philosophy/why-not-lgpl.html>.
675
COPYING.md
Normal file
### GNU GENERAL PUBLIC LICENSE
|
||||
|
||||
Version 3, 29 June 2007
|
||||
|
||||
Copyright (C) 2007 Free Software Foundation, Inc.
|
||||
<https://fsf.org/>
|
||||
|
||||
Everyone is permitted to copy and distribute verbatim copies of this
|
||||
license document, but changing it is not allowed.
|
||||
|
||||
### Preamble
|
||||
|
||||
The GNU General Public License is a free, copyleft license for
|
||||
software and other kinds of works.
|
||||
|
||||
The licenses for most software and other practical works are designed
|
||||
to take away your freedom to share and change the works. By contrast,
|
||||
the GNU General Public License is intended to guarantee your freedom
|
||||
to share and change all versions of a program--to make sure it remains
|
||||
free software for all its users. We, the Free Software Foundation, use
|
||||
the GNU General Public License for most of our software; it applies
|
||||
also to any other work released this way by its authors. You can apply
|
||||
it to your programs, too.
|
||||
|
||||
When we speak of free software, we are referring to freedom, not
|
||||
price. Our General Public Licenses are designed to make sure that you
|
||||
have the freedom to distribute copies of free software (and charge for
|
||||
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.

To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you
have certain responsibilities if you distribute copies of the
software, or if you modify it: responsibilities to respect the freedom
of others.

For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.

Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.

For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.

Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the
manufacturer can do so. This is fundamentally incompatible with the
aim of protecting users' freedom to change the software. The
systematic pattern of such abuse occurs in the area of products for
individuals to use, which is precisely where it is most unacceptable.
Therefore, we have designed this version of the GPL to prohibit the
practice for those products. If such problems arise substantially in
other domains, we stand ready to extend this provision to those
domains in future versions of the GPL, as needed to protect the
freedom of users.

Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish
to avoid the special danger that patents applied to a free program
could make it effectively proprietary. To prevent this, the GPL
assures that patents cannot be used to render the program non-free.

The precise terms and conditions for copying, distribution and
modification follow.

### TERMS AND CONDITIONS

#### 0. Definitions.

"This License" refers to version 3 of the GNU General Public License.

"Copyright" also means copyright-like laws that apply to other kinds
of works, such as semiconductor masks.

"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.

To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of
an exact copy. The resulting work is called a "modified version" of
the earlier work or a work "based on" the earlier work.

A "covered work" means either the unmodified Program or a work based
on the Program.

To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.

To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user
through a computer network, with no transfer of a copy, is not
conveying.

An interactive user interface displays "Appropriate Legal Notices" to
the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.

#### 1. Source Code.

The "source code" for a work means the preferred form of the work for
making modifications to it. "Object code" means any non-source form of
a work.

A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.

The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.

The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.

The Corresponding Source need not include anything that users can
regenerate automatically from other parts of the Corresponding Source.

The Corresponding Source for a work in source code form is that same
work.

#### 2. Basic Permissions.

All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.

You may make, run and propagate covered works that you do not convey,
without conditions so long as your license otherwise remains in force.
You may convey covered works to others for the sole purpose of having
them make modifications exclusively for you, or provide you with
facilities for running those works, provided that you comply with the
terms of this License in conveying all material for which you do not
control copyright. Those thus making or running the covered works for
you must do so exclusively on your behalf, under your direction and
control, on terms that prohibit them from making any copies of your
copyrighted material outside their relationship with you.

Conveying under any other circumstances is permitted solely under the
conditions stated below. Sublicensing is not allowed; section 10 makes
it unnecessary.

#### 3. Protecting Users' Legal Rights From Anti-Circumvention Law.

No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.

When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such
circumvention is effected by exercising rights under this License with
respect to the covered work, and you disclaim any intention to limit
operation or modification of the work as a means of enforcing, against
the work's users, your or third parties' legal rights to forbid
circumvention of technological measures.

#### 4. Conveying Verbatim Copies.

You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.

You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.

#### 5. Conveying Modified Source Versions.

You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these
conditions:

- a) The work must carry prominent notices stating that you modified
  it, and giving a relevant date.
- b) The work must carry prominent notices stating that it is
  released under this License and any conditions added under
  section 7. This requirement modifies the requirement in section 4
  to "keep intact all notices".
- c) You must license the entire work, as a whole, under this
  License to anyone who comes into possession of a copy. This
  License will therefore apply, along with any applicable section 7
  additional terms, to the whole of the work, and all its parts,
  regardless of how they are packaged. This License gives no
  permission to license the work in any other way, but it does not
  invalidate such permission if you have separately received it.
- d) If the work has interactive user interfaces, each must display
  Appropriate Legal Notices; however, if the Program has interactive
  interfaces that do not display Appropriate Legal Notices, your
  work need not make them do so.

A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.

#### 6. Conveying Non-Source Forms.

You may convey a covered work in object code form under the terms of
sections 4 and 5, provided that you also convey the machine-readable
Corresponding Source under the terms of this License, in one of these
ways:

- a) Convey the object code in, or embodied in, a physical product
  (including a physical distribution medium), accompanied by the
  Corresponding Source fixed on a durable physical medium
  customarily used for software interchange.
- b) Convey the object code in, or embodied in, a physical product
  (including a physical distribution medium), accompanied by a
  written offer, valid for at least three years and valid for as
  long as you offer spare parts or customer support for that product
  model, to give anyone who possesses the object code either (1) a
  copy of the Corresponding Source for all the software in the
  product that is covered by this License, on a durable physical
  medium customarily used for software interchange, for a price no
  more than your reasonable cost of physically performing this
  conveying of source, or (2) access to copy the Corresponding
  Source from a network server at no charge.
- c) Convey individual copies of the object code with a copy of the
  written offer to provide the Corresponding Source. This
  alternative is allowed only occasionally and noncommercially, and
  only if you received the object code with such an offer, in accord
  with subsection 6b.
- d) Convey the object code by offering access from a designated
  place (gratis or for a charge), and offer equivalent access to the
  Corresponding Source in the same way through the same place at no
  further charge. You need not require recipients to copy the
  Corresponding Source along with the object code. If the place to
  copy the object code is a network server, the Corresponding Source
  may be on a different server (operated by you or a third party)
  that supports equivalent copying facilities, provided you maintain
  clear directions next to the object code saying where to find the
  Corresponding Source. Regardless of what server hosts the
  Corresponding Source, you remain obligated to ensure that it is
  available for as long as needed to satisfy these requirements.
- e) Convey the object code using peer-to-peer transmission,
  provided you inform other peers where the object code and
  Corresponding Source of the work are being offered to the general
  public at no charge under subsection 6d.

A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.

A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal,
family, or household purposes, or (2) anything designed or sold for
incorporation into a dwelling. In determining whether a product is a
consumer product, doubtful cases shall be resolved in favor of
coverage. For a particular product received by a particular user,
"normally used" refers to a typical or common use of that class of
product, regardless of the status of the particular user or of the way
in which the particular user actually uses, or expects or is expected
to use, the product. A product is a consumer product regardless of
whether the product has substantial commercial, industrial or
non-consumer uses, unless such uses represent the only significant
mode of use of the product.

"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to
install and execute modified versions of a covered work in that User
Product from a modified version of its Corresponding Source. The
information must suffice to ensure that the continued functioning of
the modified object code is in no case prevented or interfered with
solely because modification has been made.

If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).

The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or
updates for a work that has been modified or installed by the
recipient, or for the User Product in which it has been modified or
installed. Access to a network may be denied when the modification
itself materially and adversely affects the operation of the network
or violates the rules and protocols for communication across the
network.

Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.

#### 7. Additional Terms.

"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.

When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.

Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders
of that material) supplement the terms of this License with terms:

- a) Disclaiming warranty or limiting liability differently from the
  terms of sections 15 and 16 of this License; or
- b) Requiring preservation of specified reasonable legal notices or
  author attributions in that material or in the Appropriate Legal
  Notices displayed by works containing it; or
- c) Prohibiting misrepresentation of the origin of that material,
  or requiring that modified versions of such material be marked in
  reasonable ways as different from the original version; or
- d) Limiting the use for publicity purposes of names of licensors
  or authors of the material; or
- e) Declining to grant rights under trademark law for use of some
  trade names, trademarks, or service marks; or
- f) Requiring indemnification of licensors and authors of that
  material by anyone who conveys the material (or modified versions
  of it) with contractual assumptions of liability to the recipient,
  for any liability that these contractual assumptions directly
  impose on those licensors and authors.

All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.

If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.

Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions; the
above requirements apply either way.

#### 8. Termination.

You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).

However, if you cease all violation of this License, then your license
from a particular copyright holder is reinstated (a) provisionally,
unless and until the copyright holder explicitly and finally
terminates your license, and (b) permanently, if the copyright holder
fails to notify you of the violation by some reasonable means prior to
60 days after the cessation.

Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.

Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.

#### 9. Acceptance Not Required for Having Copies.

You are not required to accept this License in order to receive or run
a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.

#### 10. Automatic Licensing of Downstream Recipients.

Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.

An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.

You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.

#### 11. Patents.

A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".

A contributor's "essential patent claims" are all patent claims owned
or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.

Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.

In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.

If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.

If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.

A patent license is "discriminatory" if it does not include within the
scope of its coverage, prohibits the exercise of, or is conditioned on
the non-exercise of one or more of the rights that are specifically
granted under this License. You may not convey a covered work if you
are a party to an arrangement with a third party that is in the
business of distributing software, under which you make payment to the
third party based on the extent of your activity of conveying the
work, and under which the third party grants, to any of the parties
who would receive the covered work from you, a discriminatory patent
license (a) in connection with copies of the covered work conveyed by
you (or copies made from those copies), or (b) primarily for and in
connection with specific products or compilations that contain the
covered work, unless you entered into that arrangement, or that patent
license was granted, prior to 28 March 2007.

Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.

#### 12. No Surrender of Others' Freedom.

If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under
this License and any other pertinent obligations, then as a
consequence you may not convey it at all. For example, if you agree to
terms that obligate you to collect a royalty for further conveying
from those to whom you convey the Program, the only way you could
satisfy both those terms and this License would be to refrain entirely
from conveying the Program.

#### 13. Use with the GNU Affero General Public License.

Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.

#### 14. Revised Versions of this License.

The Free Software Foundation may publish revised and/or new versions
of the GNU General Public License from time to time. Such new versions
will be similar in spirit to the present version, but may differ in
detail to address new problems or concerns.

Each version is given a distinguishing version number. If the Program
specifies that a certain numbered version of the GNU General Public
License "or any later version" applies to it, you have the option of
following the terms and conditions either of that numbered version or
of any later version published by the Free Software Foundation. If the
Program does not specify a version number of the GNU General Public
License, you may choose any version ever published by the Free
Software Foundation.

If the Program specifies that a proxy can decide which future versions
of the GNU General Public License can be used, that proxy's public
statement of acceptance of a version permanently authorizes you to
choose that version for the Program.

Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.

#### 15. Disclaimer of Warranty.

THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT
WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND
PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE
DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR
CORRECTION.

#### 16. Limitation of Liability.

IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR
CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES
ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT
NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR
LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM
TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER
PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

#### 17. Interpretation of Sections 15 and 16.

If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.

END OF TERMS AND CONDITIONS

### How to Apply These Terms to Your New Programs
|
||||
|
||||
If you develop a new program, and you want it to be of the greatest
|
||||
possible use to the public, the best way to achieve this is to make it
|
||||
free software which everyone can redistribute and change under these
|
||||
terms.
|
||||
|
||||
To do so, attach the following notices to the program. It is safest to
|
||||
attach them to the start of each source file to most effectively state
|
||||
the exclusion of warranty; and each file should have at least the
|
||||
"copyright" line and a pointer to where the full notice is found.
|
||||
|
||||
<one line to give the program's name and a brief idea of what it does.>
|
||||
Copyright (C) <year> <name of author>
|
||||
|
||||
This program is free software: you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License as published by
|
||||
the Free Software Foundation, either version 3 of the License, or
|
||||
(at your option) any later version.
|
||||
|
||||
This program is distributed in the hope that it will be useful,
|
||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
GNU General Public License for more details.
|
||||
|
||||
You should have received a copy of the GNU General Public License
|
||||
along with this program. If not, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
Also add information on how to contact you by electronic and paper
|
||||
mail.
|
||||
|
||||
If the program does terminal interaction, make it output a short
|
||||
notice like this when it starts in an interactive mode:
|
||||
|
||||
<program> Copyright (C) <year> <name of author>
|
||||
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
|
||||
This is free software, and you are welcome to redistribute it
|
||||
under certain conditions; type `show c' for details.
|
||||
|
||||
The hypothetical commands \`show w' and \`show c' should show the
|
||||
appropriate parts of the General Public License. Of course, your
|
||||
program's commands might be different; for a GUI interface, you would
|
||||
use an "about box".
|
||||
|
||||
You should also get your employer (if you work as a programmer) or
|
||||
school, if any, to sign a "copyright disclaimer" for the program, if
|
||||
necessary. For more information on this, and how to apply and follow
|
||||
the GNU GPL, see <https://www.gnu.org/licenses/>.
|
||||
|
||||
The GNU General Public License does not permit incorporating your
|
||||
program into proprietary programs. If your program is a subroutine
|
||||
library, you may consider it more useful to permit linking proprietary
|
||||
applications with the library. If this is what you want to do, use the
|
||||
GNU Lesser General Public License instead of this License. But first,
|
||||
please read <https://www.gnu.org/licenses/why-not-lgpl.html>.

DOCUMENTATION.md (533 changed lines): file diff suppressed because it is too large.

Dockerfile (22 changed lines)

@@ -1,19 +1,17 @@
 # This file is intended to be used apart from the containing source code tree.

 FROM python:3-alpine

-# Version of Radicale (e.g. 3.0.x)
+# Version of Radicale (e.g. v3)
 ARG VERSION=master
-# Persistent storage for data (Mount it somewhere on the host!)
+# Persistent storage for data
 VOLUME /var/lib/radicale
 # Configuration data (Put the "config" file here!)
 VOLUME /etc/radicale
-# TCP port of Radicale (Publish it on a host interface!)
+# TCP port of Radicale
 EXPOSE 5232
-# Run Radicale (Configure it here or provide a "config" file!)
+# Run Radicale
 CMD ["radicale", "--hosts", "0.0.0.0:5232"]

-# Install dependencies
-RUN apk add --no-cache gcc musl-dev libffi-dev ca-certificates openssl
-# Install Radicale
-RUN pip install --no-cache-dir "Radicale[bcrypt] @ https://github.com/Kozea/Radicale/archive/${VERSION}.tar.gz"
-# Remove build dependencies
-RUN apk del gcc musl-dev libffi-dev
+RUN apk add --no-cache ca-certificates openssl \
+ && apk add --no-cache --virtual .build-deps gcc libffi-dev musl-dev \
+ && pip install --no-cache-dir "Radicale[bcrypt] @ https://github.com/Kozea/Radicale/archive/${VERSION}.tar.gz" \
+ && apk del .build-deps
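Building and running this image can be sketched as follows; the image tag, host paths and the `v3` version value are illustrative placeholders, only the `VERSION` build argument, the two volumes and the port come from the Dockerfile above:

```shell
# Build from a tag or branch of the Radicale repository (VERSION defaults to master)
docker build --build-arg VERSION=v3 -t radicale:v3 .

# Run with the volumes and port declared in the Dockerfile
docker run -d -p 5232:5232 \
  -v /srv/radicale/data:/var/lib/radicale \
  -v /srv/radicale/config:/etc/radicale \
  radicale:v3
```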

@@ -1,3 +1,3 @@
-include COPYING DOCUMENTATION.md NEWS.md README.md
+include CHANGELOG.md COPYING.md DOCUMENTATION.md README.md
 include config rights
-include radicale.py radicale.fcgi radicale.wsgi
+include radicale.wsgi

NEWS.md (478 deleted lines)

@@ -1,478 +0,0 @@

# News

## master

## 3.0.3

* Fix internal server on OpenBSD

## 3.0.2

* Use 403 response for supported-report and valid-sync-token errors
* Internal server: Handle missing IPv6 support

## 3.0.1

* Fix XML error messages

## 3.0.0

This release is incompatible with previous releases.
See the upgrade checklist below.

* Parallel write requests
* Support PyPy
* Protect against XML denial-of-service attacks
* Check for duplicated UIDs in calendars/address books
* Only add missing UIDs for uploaded whole calendars/address books
* Switch from md5 to sha256 for UIDs and tokens
* Code cleanup:
  * All plugin interfaces were simplified and are incompatible with
    old plugins
  * Major refactor
  * Never sanitize paths multiple times (check if they are sanitized)
* Config
  * Multiple configuration files separated by ``:`` (resp. ``;``
    on Windows)
  * Optional configuration files by prepending file path with ``?``
  * Check validity of every configuration file and command line
    arguments separately
  * Report the source of invalid configuration parameters in
    error messages
  * Code cleanup:
    * Store configuration as parsed values
    * Use Schema that describes configuration and allow plugins to apply
      their own schemas
    * Mark internal settings with ``_``
* Internal server
  * Bind to IPv4 and IPv6 address, when both are available for hostname
  * Set default address to ``localhost:5232``
  * Remove settings for SSL ciphers and protocol versions (enforce safe
    defaults instead)
  * Remove settings for file locking because they are of little use
  * Remove daemonization (should be handled by service managers)
* Logging
  * Replace complex Python logger configuration with simple
    ``logging.level`` setting
  * Write PID and ``threadName`` instead of cryptic id's in log messages
  * Use ``wsgi.errors`` for logging (as required by the WSGI spec)
  * Code cleanup:
    * Don't pass logger object around (use ``logging.getLogger()``
      instead)
* Auth
  * Use ``md5`` as default for ``htpasswd_encryption`` setting
  * Move setting ``realm`` from section ``server`` to ``auth``
* Rights
  * Use permissions ``RW`` for non-leaf collections and ``rw`` for
    address books/calendars
  * New permission ``i`` that only allows access with HTTP method GET
    (CalDAV/CardDAV is susceptible to expensive search requests)
* Web
  * Add upload dialog for calendars/address books from file
  * Show startup loading message
  * Show warning if JavaScript is disabled
  * Pass HTML Validator
* Storage
  * Check for missing UIDs in items
  * Check for child collections in address books and calendars
  * Code cleanup:
    * Split BaseCollection in BaseStorage and BaseCollection
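The compound configuration paths described under Config above (several paths separated by ``:``, resp. ``;`` on Windows, with optional paths prefixed by ``?``) can be sketched as a toy parser. This is an illustration of the described behaviour only, not Radicale's actual ``config.parse_compound_paths``:

```python
import os


def parse_compound_paths(*compound_paths):
    """Split ":"-separated path strings (";" on Windows, i.e. os.pathsep)
    into (absolute_path, optional) tuples.  A "?" prefix marks a path as
    optional: it is kept without the marker, and a missing optional file
    should be skipped instead of raising an error."""
    paths = []
    for compound_path in compound_paths:
        if compound_path is None:
            continue
        for path in compound_path.split(os.pathsep):
            if not path:
                continue
            optional = path.startswith("?")
            if optional:
                path = path[1:]
            paths.append((os.path.abspath(path), optional))
    return paths
```

Calling `parse_compound_paths("/etc/radicale/config" + os.pathsep + "?~/.config/radicale/config")` would mark only the second path as optional.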
## Upgrade checklist

* Config
  * Some settings were removed
  * The default of ``auth.htpasswd_encryption`` changed to ``md5``
  * The setting ``server.realm`` moved to ``auth.realm``
  * The setting ``logging.debug`` was replaced by ``logging.level``
  * The format of the ``rights.file`` configuration file changed:
    * Permission ``r`` replaced by ``Rr``
    * Permission ``w`` replaced by ``Ww``
    * New permission ``i`` added as subset of ``r``
    * Replaced variable ``%(login)s`` by ``{user}``
    * Removed variable ``%(path)s``
    * ``{`` must be escaped as ``{{`` and ``}`` as ``}}`` in regexes
* File system storage
  * The storage format is compatible with Radicale 2.x.x
  * Run ``radicale --verify-storage`` to check for errors
* Custom plugins:
  * ``auth`` and ``web`` plugins require minor adjustments
  * ``rights`` plugins must be adapted to the new permission model
  * ``storage`` plugins require major changes
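A hypothetical ``rights.file`` in the new format might look like the sketch below; the section names and collection regexes are made up for illustration, only the ``{user}`` variable and the ``R``/``r``/``W``/``w``/``i`` permission letters come from the checklist above:

    [principal]
    user: .+
    collection: {user}
    permissions: RW

    [own-collections]
    user: .+
    collection: {user}/[^/]+
    permissions: rw

    [public-read]
    user: .*
    collection: public/[^/]+
    permissions: i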

## 2.1.10 - Wild Radish

This release is compatible with version 2.0.0.

* Update required versions for dependencies
* Get ``RADICALE_CONFIG`` from WSGI environ
* Improve HTTP status codes
* Fix race condition in storage lock creation
* Raise default limits for content length and timeout
* Log output from hook

## 2.1.9 - Wild Radish

This release is compatible with version 2.0.0.

* Specify versions for dependencies
* Move WSGI initialization into module
* Check if ``REPORT`` method is actually supported
* Include ``rights`` file in source distribution
* Specify ``md5`` and ``bcrypt`` as extras
* Improve logging messages
* Windows: Fix crash when item path is a directory

## 2.1.8 - Wild Radish

This release is compatible with version 2.0.0.

* Flush files before fsync'ing

## 2.1.7 - Wild Radish

This release is compatible with version 2.0.0.

* Don't print warning when cache format changes
* Add documentation for ``BaseAuth``
* Add ``is_authenticated2(login, user, password)`` to ``BaseAuth``
* Fix names of custom properties in PROPFIND requests with
  ``D:propname`` or ``D:allprop``
* Return all properties in PROPFIND requests with ``D:propname`` or
  ``D:allprop``
* Allow ``D:displayname`` property on all collections
* Answer with ``D:unauthenticated`` for ``D:current-user-principal`` property
  when not logged in
* Remove non-existing ``ICAL:calendar-color`` and ``C:calendar-timezone``
  properties from PROPFIND requests with ``D:propname`` or ``D:allprop``
* Add ``D:owner`` property to calendar and address book objects
* Remove ``D:getetag`` and ``D:getlastmodified`` properties from regular
  collections

## 2.1.6 - Wild Radish

This release is compatible with version 2.0.0.

* Fix content-type of VLIST
* Specify correct COMPONENT in content-type of VCALENDAR
* Cache COMPONENT of calendar objects (improves speed with some clients)
* Stricter parsing of filters
* Improve support for CardDAV filter
* Fix some smaller bugs in CalDAV filter
* Add X-WR-CALNAME and X-WR-CALDESC to calendars downloaded via HTTP/WebDAV
* Use X-WR-CALNAME and X-WR-CALDESC from calendars published via WebDAV

## 2.1.5 - Wild Radish

This release is compatible with version 2.0.0.

* Add ``--verify-storage`` command-line argument
* Allow comments in the htpasswd file
* Don't strip whitespaces from user names and passwords in the htpasswd file
* Remove cookies from logging output
* Allow uploads of whole collections with many components
* Show warning message if server.timeout is used with Python < 3.5.2

## 2.1.4 - Wild Radish

This release is compatible with version 2.0.0.

* Fix incorrect time range matching and calculation for some edge-cases with
  rescheduled recurrences
* Fix owner property

## 2.1.3 - Wild Radish

This release is compatible with version 2.0.0.

* Enable timeout for SSL handshakes and move them out of the main thread
* Create cache entries during upload of items
* Stop built-in server on Windows when Ctrl+C is pressed
* Prevent slow down when multiple requests hit a collection during cache warm-up

## 2.1.2 - Wild Radish

This release is compatible with version 2.0.0.

* Remove workarounds for bugs in VObject < 0.9.5
* Error checking of collection tags and associated components
* Improve error checking of uploaded collections and components
* Don't delete empty collection properties implicitly
* Improve logging of VObject serialization

## 2.1.1 - Wild Radish Again

This release is compatible with version 2.0.0.

* Add missing UIDs instead of failing
* Improve error checking of calendar and address book objects
* Fix upload of whole address books

## 2.1.0 - Wild Radish

This release is compatible with version 2.0.0.

* Built-in web interface for creating and managing address books and calendars
  * can be extended with web plugins
* Much faster storage backend
* Significant reduction in memory usage
* Improved logging
  * Include paths (of invalid items / requests) in log messages
  * Include configuration values causing problems in log messages
  * Log warning message for invalid requests by clients
  * Log error message for invalid files in the storage backend
  * No stack traces unless debugging is enabled
* Time range filter also regards overwritten recurrences
* Items that couldn't be filtered because of bugs in VObject are always
  returned (and a warning message is logged)
* Basic error checking of configuration files
* File system locking isn't disabled implicitly anymore, instead a new
  configuration option gets introduced
* The permissions of the lock file are not changed anymore
* Support for sync-token
* Support for client-side SSL certificates
* Rights plugins can decide if access to an item is granted explicitly
  * Respond with 403 instead of 404 for principal collections of non-existing
    users when ``owner_only`` plugin is used (information leakage)
* Authentication plugins can provide the login and password from the
  environment
  * new ``remote_user`` plugin, that gets the login from the ``REMOTE_USER``
    environment variable (for WSGI server)
  * new ``http_x_remote_user`` plugin, that gets the login from the
    ``X-Remote-User`` HTTP header (for reverse proxies)

## 2.0.0 - Little Big Radish

This release is not compatible with the 1.x.x versions. Follow our
[migration guide](https://radicale.org/2.1.html#documentation/migration-from-1xx-to-2xx)
if you want to switch from 1.x.x to 2.0.0.

* Support Python 3.3+ only, Python 2 is not supported anymore
* Keep only one simple filesystem-based storage system
* Remove built-in Git support
* Remove built-in authentication modules
* Keep the WSGI interface, use Python HTTP server by default
* Use a real iCal parser, rely on the "vobject" external module
* Add a solid calendar discovery
* Respect the difference between "files" and "folders", don't rely on slashes
* Remove the calendar creation with GET requests
* Be stateless
* Use a file locker
* Add threading
* Get atomic writes
* Support new filters
* Support read-only permissions
* Allow external plugins for authentication, rights management, storage and
  version control

## 1.1.4 - Fifth Law of Nature

* Use ``shutil.move`` for ``--export-storage``

## 1.1.3 - Fourth Law of Nature

* Add a ``--export-storage=FOLDER`` command-line argument (by Unrud, see #606)

## 1.1.2 - Third Law of Nature

* **Security fix**: Add a random timer to avoid timing oracles and simple
  bruteforce attacks when using the htpasswd authentication method.
* Various minor fixes.

## 1.1.1 - Second Law of Nature

* Fix the owner_write rights rule

## 1.1 - Law of Nature

One feature in this release is **not backward compatible**:

* Use the first matching section for rights (inspired from daald)

Now, the first section matching the path and current user in your custom rights
file is used. In the previous versions, the most permissive rights of all the
matching sections were applied. This new behaviour gives a simple way to make
specific rules at the top of the file independent from the generic ones.

Many **improvements in this release are related to security**; you should
upgrade Radicale as soon as possible:

* Improve the regex used for well-known URIs (by Unrud)
* Prevent regex injection in rights management (by Unrud)
* Prevent crafted HTTP request from calling arbitrary functions (by Unrud)
* Improve URI sanitation and conversion to filesystem path (by Unrud)
* Decouple the daemon from its parent environment (by Unrud)

Some bugs have been fixed and little enhancements have been added:

* Assign new items to correct key (by Unrud)
* Avoid race condition in PID file creation (by Unrud)
* Improve the docker version (by cdpb)
* Encode message and committer for git commits
* Test with Python 3.5

## 1.0.1 - Sunflower Again

* Update the version because of a **stupid** "feature"™ of PyPI

## 1.0 - Sunflower

* Enhanced performances (by Mathieu Dupuy)
* Add MD5-APR1 and BCRYPT for htpasswd-based authentication (by Jan-Philip Gehrcke)
* Use PAM service (by Stephen Paul Weber)
* Don't discard PROPPATCH on empty collections (by Markus Unterwaditzer)
* Write the path of the collection in the git message (by Matthew Monaco)
* Tests launched on Travis

## 0.10 - Lovely Endless Grass

* Support well-known URLs (by Mathieu Dupuy)
* Fix collection discovery (by Markus Unterwaditzer)
* Reload logger config on SIGHUP (by Élie Bouttier)
* Remove props files when deleting a collection (by Vincent Untz)
* Support salted SHA1 passwords (by Marc Kleine-Budde)
* Don't spam the logs about non-SSL IMAP connections to localhost (by Giel van Schijndel)

## 0.9 - Rivers

* Custom handlers for auth, storage and rights (by Sergey Fursov)
* 1-file-per-event storage (by Jean-Marc Martins)
* Git support for filesystem storages (by Jean-Marc Martins)
* DB storage working with PostgreSQL, MariaDB and SQLite (by Jean-Marc Martins)
* Clean rights manager based on regular expressions (by Sweil)
* Support of contacts for Apple's clients
* Support colors (by Jochen Sprickerhof)
* Decode URLs in XML (by Jean-Marc Martins)
* Fix PAM authentication (by Stepan Henek)
* Use consistent etags (by 9m66p93w)
* Use consistent sorting order (by Daniel Danner)
* Return 401 on unauthorized DELETE requests (by Eduard Braun)
* Move pid file creation in child process (by Mathieu Dupuy)
* Allow requests without base_prefix (by jheidemann)

## 0.8 - Rainbow

* New authentication and rights management modules (by Matthias Jordan)
* Experimental database storage
* Command-line option for custom configuration file (by Mark Adams)
* Root URL not at the root of a domain (by Clint Adams, Fabrice Bellet, Vincent Untz)
* Improved support for iCal, CalDAVSync, CardDAVSync, CalDavZAP and CardDavMATE
* Empty PROPFIND requests handled (by Christoph Polcin)
* Colon allowed in passwords
* Configurable realm message

## 0.7.1 - Waterfalls

* Many address books fixes
* New IMAP ACL (by Daniel Aleksandersen)
* PAM ACL fixed (by Daniel Aleksandersen)
* Courier ACL fixed (by Benjamin Frank)
* Always set display name to collections (by Oskari Timperi)
* Various DELETE responses fixed

## 0.7 - Eternal Sunshine

* Repeating events
* Collection deletion
* Courier and PAM authentication methods
* CardDAV support
* Custom LDAP filters supported

## 0.6.4 - Tulips

* Fix the installation with Python 3.1

## 0.6.3 - Red Roses

* MOVE requests fixed
* Faster REPORT answers
* Executable script moved into the package

## 0.6.2 - Seeds

* iPhone and iPad support fixed
* Backslashes replaced by slashes in PROPFIND answers on Windows
* PyPI archive set as default download URL

## 0.6.1 - Growing Up

* Example files included in the tarball
* htpasswd support fixed
* Redirection loop bug fixed
* Testing message on GET requests

## 0.6 - Sapling

* WSGI support
* IPv6 support
* Smart, verbose and configurable logs
* Apple iCal 4 and iPhone support (by Łukasz Langa)
* KDE KOrganizer support
* LDAP auth backend (by Corentin Le Bail)
* Public and private calendars (by René Neumann)
* PID file
* MOVE requests management
* Journal entries support
* Drop Python 2.5 support

## 0.5 - Historical Artifacts

* Calendar depth
* MacOS and Windows support
* HEAD requests management
* htpasswd user from calendar path

## 0.4 - Hot Days Back

* Personal calendars
* Last-Modified HTTP header
* ``no-ssl`` and ``foreground`` options
* Default configuration file

## 0.3 - Dancing Flowers

* Evolution support
* Version management

## 0.2 - Snowflakes

* Sunbird pre-1.0 support
* SSL connection
* Htpasswd authentication
* Daemon mode
* User configuration
* Twisted dependency removed
* Python 3 support
* Real URLs for PUT and DELETE
* Concurrent modification reported to users
* Many bugs fixed (by Roger Wenham)

## 0.1 - Crazy Vegetables

* First release
* Lightning/Sunbird 0.9 compatibility
* Easy installer

README.md (19 changed lines)

@@ -1,9 +1,20 @@
-# Read Me
+# Radicale
+
+[](https://github.com/Kozea/Radicale/actions/workflows/test.yml)
+[](https://coveralls.io/github/Kozea/Radicale?branch=master)

-Radicale is a free and open-source CalDAV and CardDAV server.
+Radicale is a small but powerful CalDAV (calendars, to-do lists) and CardDAV
+(contacts) server, that:
+
+* Shares calendars and contact lists through CalDAV, CardDAV and HTTP.
+* Supports events, todos, journal entries and business cards.
+* Works out-of-the-box, no complicated setup or configuration required.
+* Can limit access by authentication.
+* Can secure connections with TLS.
+* Works with many CalDAV and CardDAV clients
+* Stores all data on the file system in a simple folder structure.
+* Can be extended with plugins.
+* Is GPLv3-licensed free software.

 For the complete documentation, please visit
-[Radicale "master" Documentation](https://radicale.org/master.html).
+[Radicale master Documentation](https://radicale.org/master.html).

config (12 changed lines)

@@ -83,7 +83,7 @@
 [storage]

 # Storage backend
-# Value: multifilesystem
+# Value: multifilesystem | multifilesystem_nolock
 #type = multifilesystem

 # Folder for storing local collections, created if not present

@@ -121,8 +121,8 @@
 [hook]

-# Hook types
-# Value: none | rabbitmq
-#type = none
-#rabbitmq_endpoint =
-#rabbitmq_topic =
+# Hook types
+# Value: none | rabbitmq
+#type = none
+#rabbitmq_endpoint =
+#rabbitmq_topic =
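Uncommenting the values shown above, a configuration that selects the lock-free storage backend and enables the RabbitMQ hook might read as follows; the endpoint and topic values are placeholders, only the option names and the allowed values come from the config file above:

    [storage]
    type = multifilesystem_nolock

    [hook]
    type = rabbitmq
    rabbitmq_endpoint = amqp://localhost:5672/
    rabbitmq_topic = radicale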

@@ -1,17 +0,0 @@
-#!/usr/bin/env python3
-
-"""
-Radicale FastCGI Example.
-
-Launch a Radicale FastCGI server according to configuration.
-
-This script relies on flup but can be easily adapted to use another
-WSGI-to-FastCGI mapper.
-
-"""
-
-from flup.server.fcgi import WSGIServer
-from radicale import application
-
-if __name__ == "__main__":
-    WSGIServer(application).run()

radicale.py (13 deleted lines)

@@ -1,13 +0,0 @@
-#!/usr/bin/env python3
-
-"""
-Radicale CalDAV Server.
-
-Launch the server according to configuration and command-line options.
-
-"""
-
-import runpy
-
-if __name__ == "__main__":
-    runpy.run_module("radicale", run_name="__main__")

radicale.wsgi (0 changed lines, executable file changed to normal file)

@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2017 Guillaume Ayoub

@@ -27,46 +27,46 @@ Configuration files can be specified in the environment variable

 import os
 import threading
+from typing import Iterable, Optional, cast

-import pkg_resources
-
-from radicale import config, log
+from radicale import config, log, types, utils
 from radicale.app import Application
 from radicale.log import logger

-VERSION = pkg_resources.get_distribution("radicale").version
+VERSION: str = utils.package_version("radicale")

-_application = None
-_application_config_path = None
+_application_instance: Optional[Application] = None
+_application_config_path: Optional[str] = None
 _application_lock = threading.Lock()


-def _init_application(config_path, wsgi_errors):
-    global _application, _application_config_path
+def _get_application_instance(config_path: str, wsgi_errors: types.ErrorStream
+                              ) -> Application:
+    global _application_instance, _application_config_path
     with _application_lock:
-        if _application is not None:
-            return
-        log.setup()
-        with log.register_stream(wsgi_errors):
-            _application_config_path = config_path
-            configuration = config.load(config.parse_compound_paths(
-                config.DEFAULT_CONFIG_PATH,
-                config_path))
-            log.set_level(configuration.get("logging", "level"))
-            # Log configuration after logger is configured
-            for source, miss in configuration.sources():
-                logger.info("%s %s", "Skipped missing" if miss else "Loaded",
-                            source)
-            _application = Application(configuration)
+        if _application_instance is None:
+            log.setup()
+            with log.register_stream(wsgi_errors):
+                _application_config_path = config_path
+                configuration = config.load(config.parse_compound_paths(
+                    config.DEFAULT_CONFIG_PATH,
+                    config_path))
+                log.set_level(cast(str, configuration.get("logging", "level")))
+                # Log configuration after logger is configured
+                for source, miss in configuration.sources():
+                    logger.info("%s %s", "Skipped missing" if miss
+                                else "Loaded", source)
+                _application_instance = Application(configuration)
+    if _application_config_path != config_path:
+        raise ValueError("RADICALE_CONFIG must not change: %r != %r" %
+                         (config_path, _application_config_path))
+    return _application_instance


-def application(environ, start_response):
+def application(environ: types.WSGIEnviron,
+                start_response: types.WSGIStartResponse) -> Iterable[bytes]:
     """Entry point for external WSGI servers."""
     config_path = environ.get("RADICALE_CONFIG",
                               os.environ.get("RADICALE_CONFIG"))
-    if _application is None:
-        _init_application(config_path, environ["wsgi.errors"])
-    if _application_config_path != config_path:
-        raise ValueError("RADICALE_CONFIG must not change: %s != %s" %
-                         (repr(config_path), repr(_application_config_path)))
-    return _application(environ, start_response)
+    app = _get_application_instance(config_path, environ["wsgi.errors"])
+    return app(environ, start_response)
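The lock-guarded singleton pattern that the rewrite above introduces (build the application once, then refuse a different configuration path) can be sketched in isolation; the `Application` class here is a toy stand-in, not Radicale's:

```python
import threading
from typing import Iterable, Optional


class Application:
    """Toy WSGI application standing in for radicale.app.Application."""

    def __init__(self, config_path: Optional[str]) -> None:
        self.config_path = config_path

    def __call__(self, environ, start_response) -> Iterable[bytes]:
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"OK"]


_instance: Optional[Application] = None
_instance_config_path: Optional[str] = None
_lock = threading.Lock()


def get_application_instance(config_path: Optional[str]) -> Application:
    """Create the application on first use; later callers must pass the
    same configuration path, mirroring the RADICALE_CONFIG check above."""
    global _instance, _instance_config_path
    with _lock:
        if _instance is None:
            _instance_config_path = config_path
            _instance = Application(config_path)
    if _instance_config_path != config_path:
        raise ValueError("config must not change: %r != %r" %
                         (config_path, _instance_config_path))
    return _instance
```

Repeated calls with the same path return the identical object; a call with a different path raises, so a misconfigured WSGI worker fails loudly instead of silently serving with the wrong configuration.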
|
@ -1,4 +1,4 @@
|
|||
# This file is part of Radicale Server - Calendar Server
|
||||
# This file is part of Radicale - CalDAV and CardDAV server
|
||||
# Copyright © 2011-2017 Guillaume Ayoub
|
||||
# Copyright © 2017-2019 Unrud <unrud@outlook.com>
|
||||
#
|
||||
|
@@ -29,128 +29,176 @@ import os
import signal
import socket
import sys
from typing import List, Optional, cast

from radicale import VERSION, config, log, server, storage
from radicale import VERSION, config, log, server, storage, types
from radicale.log import logger
from types import FrameType


def run():
def run() -> None:
    """Run Radicale as a standalone server."""
    exit_signal_numbers = [signal.SIGTERM, signal.SIGINT]
    if sys.platform == "win32":
        exit_signal_numbers.append(signal.SIGBREAK)
    else:
        exit_signal_numbers.append(signal.SIGHUP)
        exit_signal_numbers.append(signal.SIGQUIT)

    # Raise SystemExit when signal arrives to run cleanup code
    # (like destructors, try-finish etc.), otherwise the process exits
    # without running any of them
    def exit_signal_handler(signal_number: int,
                            stack_frame: Optional[FrameType]) -> None:
        sys.exit(1)
    for signal_number in exit_signal_numbers:
        signal.signal(signal_number, exit_signal_handler)

    log.setup()

    # Get command-line arguments
    parser = argparse.ArgumentParser(usage="radicale [OPTIONS]")
    # Configuration options are stored in dest with format "c:SECTION:OPTION"
    parser = argparse.ArgumentParser(
        prog="radicale", usage="%(prog)s [OPTIONS]", allow_abbrev=False)

    parser.add_argument("--version", action="version", version=VERSION)
    parser.add_argument("--verify-storage", action="store_true",
                        help="check the storage for errors and exit")
    parser.add_argument(
        "-C", "--config", help="use specific configuration files", nargs="*")
    parser.add_argument("-D", "--debug", action="store_true",
    parser.add_argument("-C", "--config",
                        help="use specific configuration files", nargs="*")
    parser.add_argument("-D", "--debug", action="store_const", const="debug",
                        dest="c:logging:level", default=argparse.SUPPRESS,
                        help="print debug information")

    groups = {}
    for section, values in config.DEFAULT_CONFIG_SCHEMA.items():
    for section, section_data in config.DEFAULT_CONFIG_SCHEMA.items():
        if section.startswith("_"):
            continue
        group = parser.add_argument_group(section)
        groups[group] = []
        for option, data in values.items():
        assert ":" not in section  # check field separator
        assert "-" not in section and "_" not in section  # not implemented
        group_description = None
        if section_data.get("_allow_extra"):
            group_description = "additional options allowed"
            if section == "headers":
                group_description += " (e.g. --headers-Pragma=no-cache)"
        elif "type" in section_data:
            group_description = "backend specific options omitted"
        group = parser.add_argument_group(section, group_description)
        for option, data in section_data.items():
            if option.startswith("_"):
                continue
            kwargs = data.copy()
            long_name = "--%s-%s" % (section, option.replace("_", "-"))
            args = kwargs.pop("aliases", [])
            args: List[str] = list(kwargs.pop("aliases", ()))
            args.append(long_name)
            kwargs["dest"] = "%s_%s" % (section, option)
            groups[group].append(kwargs["dest"])
            kwargs["dest"] = "c:%s:%s" % (section, option)
            kwargs["metavar"] = "VALUE"
            kwargs["default"] = argparse.SUPPRESS
            del kwargs["value"]
            with contextlib.suppress(KeyError):
                del kwargs["internal"]

            if kwargs["type"] == bool:
                del kwargs["type"]
                kwargs["action"] = "store_const"
                kwargs["const"] = "True"
                opposite_args = kwargs.pop("opposite", [])
                opposite_args = list(kwargs.pop("opposite_aliases", ()))
                opposite_args.append("--no%s" % long_name[1:])
                group.add_argument(*args, **kwargs)

                kwargs["const"] = "False"
                group.add_argument(*args, nargs="?", const="True", **kwargs)
                # Opposite argument
                kwargs["help"] = "do not %s (opposite of %s)" % (
                    kwargs["help"], long_name)
                group.add_argument(*opposite_args, **kwargs)
                group.add_argument(*opposite_args, action="store_const",
                                   const="False", **kwargs)
            else:
                del kwargs["type"]
                group.add_argument(*args, **kwargs)

    args = parser.parse_args()
    args_ns, remaining_args = parser.parse_known_args()
    unrecognized_args = []
    while remaining_args:
        arg = remaining_args.pop(0)
        for section, data in config.DEFAULT_CONFIG_SCHEMA.items():
            if "type" not in data and not data.get("_allow_extra"):
                continue
            prefix = "--%s-" % section
            if arg.startswith(prefix):
                arg = arg[len(prefix):]
                break
        else:
            unrecognized_args.append(arg)
            continue
        value = ""
        if "=" in arg:
            arg, value = arg.split("=", maxsplit=1)
        elif remaining_args and not remaining_args[0].startswith("-"):
            value = remaining_args.pop(0)
        option = arg
        if not data.get("_allow_extra"):  # preserve dash in HTTP header names
            option = option.replace("-", "_")
        vars(args_ns)["c:%s:%s" % (section, option)] = value
    if unrecognized_args:
        parser.error("unrecognized arguments: %s" %
                     " ".join(unrecognized_args))

    # Preliminary configure logging
    if args.debug:
        args.logging_level = "debug"
    with contextlib.suppress(ValueError):
        log.set_level(config.DEFAULT_CONFIG_SCHEMA["logging"]["level"]["type"](
            args.logging_level))
            vars(args_ns).get("c:logging:level", "")))

    # Update Radicale configuration according to arguments
    arguments_config = {}
    for group, actions in groups.items():
        section = group.title
        section_config = {}
        for action in actions:
            value = getattr(args, action)
            if value is not None:
                section_config[action.split('_', 1)[1]] = value
        if section_config:
            arguments_config[section] = section_config
    arguments_config: types.MUTABLE_CONFIG = {}
    for key, value in vars(args_ns).items():
        if key.startswith("c:"):
            _, section, option = key.split(":", maxsplit=2)
            arguments_config[section] = arguments_config.get(section, {})
            arguments_config[section][option] = value

    try:
        configuration = config.load(config.parse_compound_paths(
            config.DEFAULT_CONFIG_PATH,
            os.environ.get("RADICALE_CONFIG"),
            os.pathsep.join(args.config) if args.config else None))
            os.pathsep.join(args_ns.config) if args_ns.config is not None
            else None))
        if arguments_config:
            configuration.update(arguments_config, "arguments")
            configuration.update(arguments_config, "command line arguments")
    except Exception as e:
        logger.fatal("Invalid configuration: %s", e, exc_info=True)
        logger.critical("Invalid configuration: %s", e, exc_info=True)
        sys.exit(1)

    # Configure logging
    log.set_level(configuration.get("logging", "level"))
    log.set_level(cast(str, configuration.get("logging", "level")))

    # Log configuration after logger is configured
    for source, miss in configuration.sources():
        logger.info("%s %s", "Skipped missing" if miss else "Loaded", source)

    if args.verify_storage:
    if args_ns.verify_storage:
        logger.info("Verifying storage")
        try:
            storage_ = storage.load(configuration)
            with storage_.acquire_lock("r"):
                if not storage_.verify():
                    logger.fatal("Storage verifcation failed")
                    logger.critical("Storage verifcation failed")
                    sys.exit(1)
        except Exception as e:
            logger.fatal("An exception occurred during storage verification: "
                         "%s", e, exc_info=True)
            logger.critical("An exception occurred during storage "
                            "verification: %s", e, exc_info=True)
            sys.exit(1)
        return

    # Create a socket pair to notify the server of program shutdown
    shutdown_socket, shutdown_socket_out = socket.socketpair()

    # SIGTERM and SIGINT (aka KeyboardInterrupt) shutdown the server
    def shutdown(signal_number, stack_frame):
    # Shutdown server when signal arrives
    def shutdown_signal_handler(signal_number: int,
                                stack_frame: Optional[FrameType]) -> None:
        shutdown_socket.close()
    signal.signal(signal.SIGTERM, shutdown)
    signal.signal(signal.SIGINT, shutdown)
    for signal_number in exit_signal_numbers:
        signal.signal(signal_number, shutdown_signal_handler)

    try:
        server.serve(configuration, shutdown_socket_out)
    except Exception as e:
        logger.fatal("An exception occurred during server startup: %s", e,
                     exc_info=True)
        logger.critical("An exception occurred during server startup: %s", e,
                        exc_info=True)
        sys.exit(1)

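The `run()` diff above replaces separate SIGTERM/SIGINT handlers with a shared list of exit signals whose handler raises `SystemExit`, so that `try`/`finally` blocks and destructors still run on shutdown. A minimal standalone sketch of that pattern (the names here are illustrative, not Radicale's API):

```python
import signal
import sys

cleanup_ran = False


def exit_signal_handler(signal_number, stack_frame):
    # Raising SystemExit (via sys.exit) lets try/finally blocks and
    # destructors run before the process terminates.
    sys.exit(1)


def main():
    global cleanup_ran
    for signal_number in (signal.SIGTERM, signal.SIGINT):
        signal.signal(signal_number, exit_signal_handler)
    try:
        signal.raise_signal(signal.SIGTERM)  # simulate an external kill
    finally:
        cleanup_ran = True  # reached because the handler raised SystemExit


try:
    main()
except SystemExit:
    pass
```

Had the handler called `os._exit()` instead, the `finally` block would never run; raising `SystemExit` is what makes orderly cleanup possible.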
@@ -1,4 +1,4 @@
# This file is part of Radicale Server - Calendar Server
# This file is part of Radicale - CalDAV and CardDAV server
# Copyright © 2008 Nicolas Kandel
# Copyright © 2008 Pascal Halter
# Copyright © 2008-2017 Guillaume Ayoub
@@ -27,47 +27,49 @@ the built-in server (see ``radicale.server`` module).

import base64
import datetime
import io
import logging
import posixpath
import pprint
import random
import time
import zlib
from http import client
from xml.etree import ElementTree as ET
from typing import Iterable, List, Mapping, Tuple, Union

import defusedxml.ElementTree as DefusedET
import pkg_resources

from radicale import (auth, hook, httputils, log, pathutils, rights, storage,
                      web, xmlutils)
from radicale.app.delete import ApplicationDeleteMixin
from radicale.app.get import ApplicationGetMixin
from radicale.app.head import ApplicationHeadMixin
from radicale.app.mkcalendar import ApplicationMkcalendarMixin
from radicale.app.mkcol import ApplicationMkcolMixin
from radicale.app.move import ApplicationMoveMixin
from radicale.app.options import ApplicationOptionsMixin
from radicale.app.propfind import ApplicationPropfindMixin
from radicale.app.proppatch import ApplicationProppatchMixin
from radicale.app.put import ApplicationPutMixin
from radicale.app.report import ApplicationReportMixin
from radicale import config, httputils, log, pathutils, types
from radicale.app.base import ApplicationBase
from radicale.app.delete import ApplicationPartDelete
from radicale.app.get import ApplicationPartGet
from radicale.app.head import ApplicationPartHead
from radicale.app.mkcalendar import ApplicationPartMkcalendar
from radicale.app.mkcol import ApplicationPartMkcol
from radicale.app.move import ApplicationPartMove
from radicale.app.options import ApplicationPartOptions
from radicale.app.post import ApplicationPartPost
from radicale.app.propfind import ApplicationPartPropfind
from radicale.app.proppatch import ApplicationPartProppatch
from radicale.app.put import ApplicationPartPut
from radicale.app.report import ApplicationPartReport
from radicale.log import logger

VERSION = pkg_resources.get_distribution("radicale").version
# Combination of types.WSGIStartResponse and WSGI application return value
_IntermediateResponse = Tuple[str, List[Tuple[str, str]], Iterable[bytes]]


class Application(
        ApplicationDeleteMixin, ApplicationGetMixin, ApplicationHeadMixin,
        ApplicationMkcalendarMixin, ApplicationMkcolMixin,
        ApplicationMoveMixin, ApplicationOptionsMixin,
        ApplicationPropfindMixin, ApplicationProppatchMixin,
        ApplicationPutMixin, ApplicationReportMixin):

class Application(ApplicationPartDelete, ApplicationPartHead,
                  ApplicationPartGet, ApplicationPartMkcalendar,
                  ApplicationPartMkcol, ApplicationPartMove,
                  ApplicationPartOptions, ApplicationPartPropfind,
                  ApplicationPartProppatch, ApplicationPartPost,
                  ApplicationPartPut, ApplicationPartReport, ApplicationBase):
    """WSGI application."""

    def __init__(self, configuration):
    _mask_passwords: bool
    _auth_delay: float
    _internal_server: bool
    _max_content_length: int
    _auth_realm: str
    _extra_headers: Mapping[str, str]

    def __init__(self, configuration: config.Configuration) -> None:
        """Initialize Application.

        ``configuration`` see ``radicale.config`` module.

@@ -75,85 +77,65 @@ class Application(
        this object, it is kept as an internal reference.

        """
        super().__init__()
        self.configuration = configuration
        self._auth = auth.load(configuration)
        self._storage = storage.load(configuration)
        self._rights = rights.load(configuration)
        self._web = web.load(configuration)
        self._encoding = configuration.get("encoding", "request")
        self._hook = hook.load(configuration)
        super().__init__(configuration)
        self._mask_passwords = configuration.get("logging", "mask_passwords")
        self._auth_delay = configuration.get("auth", "delay")
        self._internal_server = configuration.get("server", "_internal_server")
        self._max_content_length = configuration.get(
            "server", "max_content_length")
        self._auth_realm = configuration.get("auth", "realm")
        self._extra_headers = dict()
        for key in self.configuration.options("headers"):
            self._extra_headers[key] = configuration.get("headers", key)

    def _headers_log(self, environ):
        """Sanitize headers for logging."""
        request_environ = dict(environ)
    def _scrub_headers(self, environ: types.WSGIEnviron) -> types.WSGIEnviron:
        """Mask passwords and cookies."""
        headers = dict(environ)
        if (self._mask_passwords and
                headers.get("HTTP_AUTHORIZATION", "").startswith("Basic")):
            headers["HTTP_AUTHORIZATION"] = "Basic **masked**"
        if headers.get("HTTP_COOKIE"):
            headers["HTTP_COOKIE"] = "**masked**"
        return headers

        # Mask passwords
        mask_passwords = self.configuration.get("logging", "mask_passwords")
        authorization = request_environ.get("HTTP_AUTHORIZATION", "")
        if mask_passwords and authorization.startswith("Basic"):
            request_environ["HTTP_AUTHORIZATION"] = "Basic **masked**"
        if request_environ.get("HTTP_COOKIE"):
            request_environ["HTTP_COOKIE"] = "**masked**"

        return request_environ

    def _decode(self, text, environ):
        """Try to magically decode ``text`` according to given ``environ``."""
        # List of charsets to try
        charsets = []

        # First append content charset given in the request
        content_type = environ.get("CONTENT_TYPE")
        if content_type and "charset=" in content_type:
            charsets.append(
                content_type.split("charset=")[1].split(";")[0].strip())
        # Then append default Radicale charset
        charsets.append(self._encoding)
        # Then append various fallbacks
        charsets.append("utf-8")
        charsets.append("iso8859-1")

        # Try to decode
        for charset in charsets:
            try:
                return text.decode(charset)
            except UnicodeDecodeError:
                pass
        raise UnicodeDecodeError

    def __call__(self, environ, start_response):
    def __call__(self, environ: types.WSGIEnviron, start_response:
                 types.WSGIStartResponse) -> Iterable[bytes]:
        with log.register_stream(environ["wsgi.errors"]):
            try:
                status, headers, answers = self._handle_request(environ)
                status_text, headers, answers = self._handle_request(environ)
            except Exception as e:
                try:
                    method = str(environ["REQUEST_METHOD"])
                except Exception:
                    method = "unknown"
                try:
                    path = str(environ.get("PATH_INFO", ""))
                except Exception:
                    path = ""
                logger.error("An exception occurred during %s request on %r: "
                             "%s", method, path, e, exc_info=True)
                status, headers, answer = httputils.INTERNAL_SERVER_ERROR
                answer = answer.encode("ascii")
                status = "%d %s" % (
                    status.value, client.responses.get(status, "Unknown"))
                headers = [
                    ("Content-Length", str(len(answer)))] + list(headers)
                             "%s", environ.get("REQUEST_METHOD", "unknown"),
                             environ.get("PATH_INFO", ""), e, exc_info=True)
                # Make minimal response
                status, raw_headers, raw_answer = (
                    httputils.INTERNAL_SERVER_ERROR)
                assert isinstance(raw_answer, str)
                answer = raw_answer.encode("ascii")
                status_text = "%d %s" % (
                    status, client.responses.get(status, "Unknown"))
                headers = [*raw_headers, ("Content-Length", str(len(answer)))]
                answers = [answer]
            start_response(status, headers)
            start_response(status_text, headers)
            if environ.get("REQUEST_METHOD") == "HEAD":
                return []
            return answers

    def _handle_request(self, environ):
    def _handle_request(self, environ: types.WSGIEnviron
                        ) -> _IntermediateResponse:
        time_begin = datetime.datetime.now()
        request_method = environ["REQUEST_METHOD"].upper()
        unsafe_path = environ.get("PATH_INFO", "")

        """Manage a request."""
        def response(status, headers=(), answer=None):
        def response(status: int, headers: types.WSGIResponseHeaders,
                     answer: Union[None, str, bytes]) -> _IntermediateResponse:
            """Helper to create response from internal types.WSGIResponse"""
            headers = dict(headers)
            # Set content length
            if answer:
                if hasattr(answer, "encode"):
            answers = []
            if answer is not None:
                if isinstance(answer, str):
                    logger.debug("Response content:\n%s", answer)
                    headers["Content-Type"] += "; charset=%s" % self._encoding
                    answer = answer.encode(self._encoding)

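The removed `_decode` method above (and its replacement, `httputils.decode_request`) tries a list of charsets in order: the one declared in the request, the configured default, then common fallbacks. A self-contained sketch of the same idea, assuming nothing about Radicale's actual helper; note it constructs `UnicodeDecodeError` with its required arguments, unlike the old code's bare `raise UnicodeDecodeError`, which would itself fail with a `TypeError`:

```python
def decode_with_fallbacks(raw: bytes, content_type: str = "",
                          default_encoding: str = "utf-8") -> str:
    # Charsets are tried in order: the charset declared in the request's
    # Content-Type, the configured default, then common fallbacks.
    charsets = []
    if "charset=" in content_type:
        charsets.append(
            content_type.split("charset=")[1].split(";")[0].strip())
    charsets += [default_encoding, "utf-8", "iso8859-1"]
    for charset in charsets:
        try:
            return raw.decode(charset)
        except (UnicodeDecodeError, LookupError):
            continue
    # UnicodeDecodeError requires (encoding, object, start, end, reason)
    raise UnicodeDecodeError("unknown", raw, 0, len(raw), "no charset matched")


text = decode_with_fallbacks("héllo".encode("iso8859-1"),
                             "text/xml; charset=iso-8859-1")
```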
@@ -168,21 +150,20 @@ class Application(
                    headers["Content-Encoding"] = "gzip"

                headers["Content-Length"] = str(len(answer))
                answers.append(answer)

            # Add extra headers set in configuration
            for key in self.configuration.options("headers"):
                headers[key] = self.configuration.get("headers", key)
            headers.update(self._extra_headers)

            # Start response
            time_end = datetime.datetime.now()
            status = "%d %s" % (
            status_text = "%d %s" % (
                status, client.responses.get(status, "Unknown"))
            logger.info(
                "%s response status for %r%s in %.3f seconds: %s",
                environ["REQUEST_METHOD"], environ.get("PATH_INFO", ""),
                depthinfo, (time_end - time_begin).total_seconds(), status)
            logger.info("%s response status for %r%s in %.3f seconds: %s",
                        request_method, unsafe_path, depthinfo,
                        (time_end - time_begin).total_seconds(), status_text)
            # Return response content
            return status, list(headers.items()), [answer] if answer else []
            return status_text, list(headers.items()), answers

        remote_host = "unknown"
        if environ.get("REMOTE_HOST"):

@@ -190,45 +171,56 @@ class Application(
        elif environ.get("REMOTE_ADDR"):
            remote_host = environ["REMOTE_ADDR"]
        if environ.get("HTTP_X_FORWARDED_FOR"):
            remote_host = "%r (forwarded by %s)" % (
                environ["HTTP_X_FORWARDED_FOR"], remote_host)
            remote_host = "%s (forwarded for %r)" % (
                remote_host, environ["HTTP_X_FORWARDED_FOR"])
        remote_useragent = ""
        if environ.get("HTTP_USER_AGENT"):
            remote_useragent = " using %r" % environ["HTTP_USER_AGENT"]
        depthinfo = ""
        if environ.get("HTTP_DEPTH"):
            depthinfo = " with depth %r" % environ["HTTP_DEPTH"]
        time_begin = datetime.datetime.now()
        logger.info(
            "%s request for %r%s received from %s%s",
            environ["REQUEST_METHOD"], environ.get("PATH_INFO", ""), depthinfo,
            remote_host, remote_useragent)
        headers = pprint.pformat(self._headers_log(environ))
        logger.debug("Request headers:\n%s", headers)
        logger.info("%s request for %r%s received from %s%s",
                    request_method, unsafe_path, depthinfo,
                    remote_host, remote_useragent)
        logger.debug("Request headers:\n%s",
                     pprint.pformat(self._scrub_headers(environ)))

        # Let reverse proxies overwrite SCRIPT_NAME
        if "HTTP_X_SCRIPT_NAME" in environ:
            # script_name must be removed from PATH_INFO by the client.
            unsafe_base_prefix = environ["HTTP_X_SCRIPT_NAME"]
            logger.debug("Script name overwritten by client: %r",
                         unsafe_base_prefix)
        else:
            # SCRIPT_NAME is already removed from PATH_INFO, according to the
            # WSGI specification.
            unsafe_base_prefix = environ.get("SCRIPT_NAME", "")
        # Sanitize base prefix
        base_prefix = pathutils.sanitize_path(unsafe_base_prefix).rstrip("/")
        logger.debug("Sanitized script name: %r", base_prefix)
        # SCRIPT_NAME is already removed from PATH_INFO, according to the
        # WSGI specification.
        # Reverse proxies can overwrite SCRIPT_NAME with X-SCRIPT-NAME header
        base_prefix_src = ("HTTP_X_SCRIPT_NAME" if "HTTP_X_SCRIPT_NAME" in
                           environ else "SCRIPT_NAME")
        base_prefix = environ.get(base_prefix_src, "")
        if base_prefix and base_prefix[0] != "/":
            logger.error("Base prefix (from %s) must start with '/': %r",
                         base_prefix_src, base_prefix)
            if base_prefix_src == "HTTP_X_SCRIPT_NAME":
                return response(*httputils.BAD_REQUEST)
            return response(*httputils.INTERNAL_SERVER_ERROR)
        if base_prefix.endswith("/"):
            logger.warning("Base prefix (from %s) must not end with '/': %r",
                           base_prefix_src, base_prefix)
            base_prefix = base_prefix.rstrip("/")
        logger.debug("Base prefix (from %s): %r", base_prefix_src, base_prefix)
        # Sanitize request URI (a WSGI server indicates with an empty path,
        # that the URL targets the application root without a trailing slash)
        path = pathutils.sanitize_path(environ.get("PATH_INFO", ""))
        path = pathutils.sanitize_path(unsafe_path)
        logger.debug("Sanitized path: %r", path)

        # Get function corresponding to method
        function = getattr(self, "do_%s" % environ["REQUEST_METHOD"].upper())
        function = getattr(self, "do_%s" % request_method, None)
        if not function:
            return response(*httputils.METHOD_NOT_ALLOWED)

        # If "/.well-known" is not available, clients query "/"
        if path == "/.well-known" or path.startswith("/.well-known/"):
        # Redirect all "…/.well-known/{caldav,carddav}" paths to "/".
        # This shouldn't be necessary but some clients like TbSync require it.
        # Status must be MOVED PERMANENTLY using FOUND causes problems
        if (path.rstrip("/").endswith("/.well-known/caldav") or
                path.rstrip("/").endswith("/.well-known/carddav")):
            return response(*httputils.redirect(
                base_prefix + "/", client.MOVED_PERMANENTLY))
        # Return NOT FOUND for all other paths containing ".well-knwon"
        if path.endswith("/.well-known") or "/.well-known/" in path:
            return response(*httputils.NOT_FOUND)

        # Ask authentication backend to check rights

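The new base-prefix handling above enforces two rules: a non-empty prefix must start with `/` (a bad client-supplied `X-Script-Name` yields 400, a bad server-side `SCRIPT_NAME` yields 500), and a trailing `/` is only warned about and stripped. A hedged sketch of just that validation logic (the function name and return convention are invented for illustration):

```python
from typing import Optional, Tuple

BAD_REQUEST = 400
INTERNAL_SERVER_ERROR = 500


def check_base_prefix(base_prefix: str,
                      from_proxy_header: bool) -> Tuple[str, Optional[int]]:
    # A non-empty prefix not starting with '/' is rejected: a bad
    # client-supplied X-Script-Name is the client's fault (400), a bad
    # SCRIPT_NAME is a server misconfiguration (500).
    if base_prefix and not base_prefix.startswith("/"):
        return "", BAD_REQUEST if from_proxy_header else INTERNAL_SERVER_ERROR
    # A trailing '/' is tolerated but stripped (with a warning in Radicale).
    return base_prefix.rstrip("/"), None
```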
@@ -240,8 +232,9 @@ class Application(
            login, password = login or "", password or ""
        elif authorization.startswith("Basic"):
            authorization = authorization[len("Basic"):].strip()
            login, password = self._decode(base64.b64decode(
                authorization.encode("ascii")), environ).split(":", 1)
            login, password = httputils.decode_request(
                self.configuration, environ, base64.b64decode(
                    authorization.encode("ascii"))).split(":", 1)

        user = self._auth.login(login, password) or "" if login else ""
        if user and login == user:

@@ -249,11 +242,11 @@ class Application(
        elif user:
            logger.info("Successful login: %r -> %r", login, user)
        elif login:
            logger.info("Failed login attempt: %r", login)
            logger.warning("Failed login attempt from %s: %r",
                           remote_host, login)
            # Random delay to avoid timing oracles and bruteforce attacks
            delay = self.configuration.get("auth", "delay")
            if delay > 0:
                random_delay = delay * (0.5 + random.random())
            if self._auth_delay > 0:
                random_delay = self._auth_delay * (0.5 + random.random())
                logger.debug("Sleeping %.3f seconds", random_delay)
                time.sleep(random_delay)

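The failed-login branch above sleeps for a randomized interval so a failed attempt neither acts as a timing oracle nor allows rapid brute forcing. The delay is uniform over `[0.5, 1.5) * auth_delay`. A small sketch of that computation (the helper name is illustrative):

```python
import random


def failed_login_delay(auth_delay: float, rng: random.Random) -> float:
    # The sleep is uniformly distributed over [0.5, 1.5) * auth_delay, so
    # the response time of a failed attempt leaks no useful information.
    if auth_delay > 0:
        return auth_delay * (0.5 + rng.random())
    return 0.0
```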
@@ -266,8 +259,8 @@ class Application(
        if user:
            principal_path = "/%s/" % user
            with self._storage.acquire_lock("r", user):
                principal = next(self._storage.discover(
                    principal_path, depth="1"), None)
                principal = next(iter(self._storage.discover(
                    principal_path, depth="1")), None)
            if not principal:
                if "W" in self._rights.authorization(user, principal_path):
                    with self._storage.acquire_lock("w", user):

@@ -281,13 +274,12 @@ class Application(
                    logger.warning("Access to principal path %r denied by "
                                   "rights backend", principal_path)

        if self.configuration.get("server", "_internal_server"):
        if self._internal_server:
            # Verify content length
            content_length = int(environ.get("CONTENT_LENGTH") or 0)
            if content_length:
                max_content_length = self.configuration.get(
                    "server", "max_content_length")
                if max_content_length and content_length > max_content_length:
                if (self._max_content_length > 0 and
                        content_length > self._max_content_length):
                    logger.info("Request body too large: %d", content_length)
                    return response(*httputils.REQUEST_ENTITY_TOO_LARGE)

@@ -305,94 +297,9 @@ class Application(
            # Unknown or unauthorized user
            logger.debug("Asking client for authentication")
            status = client.UNAUTHORIZED
            realm = self.configuration.get("auth", "realm")
            headers = dict(headers)
            headers.update({
                "WWW-Authenticate":
                    "Basic realm=\"%s\"" % realm})
                    "Basic realm=\"%s\"" % self._auth_realm})

        return response(status, headers, answer)

    def _read_raw_content(self, environ):
        content_length = int(environ.get("CONTENT_LENGTH") or 0)
        if not content_length:
            return b""
        content = environ["wsgi.input"].read(content_length)
        if len(content) < content_length:
            raise RuntimeError("Request body too short: %d" % len(content))
        return content

    def _read_content(self, environ):
        content = self._decode(self._read_raw_content(environ), environ)
        logger.debug("Request content:\n%s", content)
        return content

    def _read_xml_content(self, environ):
        content = self._decode(self._read_raw_content(environ), environ)
        if not content:
            return None
        try:
            xml_content = DefusedET.fromstring(content)
        except ET.ParseError as e:
            logger.debug("Request content (Invalid XML):\n%s", content)
            raise RuntimeError("Failed to parse XML: %s" % e) from e
        if logger.isEnabledFor(logging.DEBUG):
            logger.debug("Request content:\n%s",
                         xmlutils.pretty_xml(xml_content))
        return xml_content

    def _write_xml_content(self, xml_content):
        if logger.isEnabledFor(logging.DEBUG):
            logger.debug("Response content:\n%s",
                         xmlutils.pretty_xml(xml_content))
        f = io.BytesIO()
        ET.ElementTree(xml_content).write(f, encoding=self._encoding,
                                          xml_declaration=True)
        return f.getvalue()

    def _webdav_error_response(self, status, human_tag):
        """Generate XML error response."""
        headers = {"Content-Type": "text/xml; charset=%s" % self._encoding}
        content = self._write_xml_content(xmlutils.webdav_error(human_tag))
        return status, headers, content


class Access:
    """Helper class to check access rights of an item"""

    def __init__(self, rights, user, path):
        self._rights = rights
        self.user = user
        self.path = path
        self.parent_path = pathutils.unstrip_path(
            posixpath.dirname(pathutils.strip_path(path)), True)
        self.permissions = self._rights.authorization(self.user, self.path)
        self._parent_permissions = None

    @property
    def parent_permissions(self):
        if self.path == self.parent_path:
            return self.permissions
        if self._parent_permissions is None:
            self._parent_permissions = self._rights.authorization(
                self.user, self.parent_path)
        return self._parent_permissions

    def check(self, permission, item=None):
        if permission not in "rw":
            raise ValueError("Invalid permission argument: %r" % permission)
        if not item:
            permissions = permission + permission.upper()
            parent_permissions = permission
        elif isinstance(item, storage.BaseCollection):
            if item.get_meta("tag"):
                permissions = permission
            else:
                permissions = permission.upper()
            parent_permissions = ""
        else:
            permissions = ""
            parent_permissions = permission
        return bool(rights.intersect(self.permissions, permissions) or (
            self.path != self.parent_path and
            rights.intersect(self.parent_permissions, parent_permissions)))

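The `Access.check` logic above boils down to intersecting strings of permission flags ("r"/"w" for items, "R"/"W" for non-tagged collections). A simplified sketch of the plain-item case only (collections with and without a tag add further cases; `intersect` stands in for `radicale.rights.intersect`):

```python
def intersect(a: str, b: str) -> str:
    # Stand-in for radicale.rights.intersect: flags present in both strings
    return "".join(p for p in a if p in b)


def check_item_access(permissions: str, parent_permissions: str,
                      permission: str) -> bool:
    # Simplified Access.check for a plain (non-collection) item: the item
    # itself grants nothing; access follows from the parent collection's
    # lower-case, item-level permission flag.
    if permission not in "rw":
        raise ValueError("Invalid permission argument: %r" % permission)
    return bool(intersect(parent_permissions, permission))
```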
radicale/app/base.py (134 lines, new file)

@@ -0,0 +1,134 @@
# This file is part of Radicale - CalDAV and CardDAV server
# Copyright © 2020 Unrud <unrud@outlook.com>
#
# This library is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Radicale.  If not, see <http://www.gnu.org/licenses/>.

import io
import logging
import posixpath
import sys
import xml.etree.ElementTree as ET
from typing import Optional

# HACK: https://github.com/tiran/defusedxml/issues/54
import defusedxml.ElementTree as DefusedET  # isort:skip

from radicale import (auth, hook, config, httputils, pathutils, rights, storage,
                      types, web, xmlutils)
from radicale.log import logger

sys.modules["xml.etree"].ElementTree = ET  # type:ignore[attr-defined]


class ApplicationBase:

    configuration: config.Configuration
    _auth: auth.BaseAuth
    _storage: storage.BaseStorage
    _rights: rights.BaseRights
    _web: web.BaseWeb
    _encoding: str
    _hook: hook.BaseHook

    def __init__(self, configuration: config.Configuration) -> None:
        self.configuration = configuration
        self._auth = auth.load(configuration)
        self._storage = storage.load(configuration)
        self._rights = rights.load(configuration)
        self._web = web.load(configuration)
        self._encoding = configuration.get("encoding", "request")
        self._hook = hook.load(configuration)

    def _read_xml_request_body(self, environ: types.WSGIEnviron
                               ) -> Optional[ET.Element]:
        content = httputils.decode_request(
            self.configuration, environ,
            httputils.read_raw_request_body(self.configuration, environ))
        if not content:
            return None
        try:
            xml_content = DefusedET.fromstring(content)
        except ET.ParseError as e:
            logger.debug("Request content (Invalid XML):\n%s", content)
            raise RuntimeError("Failed to parse XML: %s" % e) from e
        if logger.isEnabledFor(logging.DEBUG):
            logger.debug("Request content:\n%s",
                         xmlutils.pretty_xml(xml_content))
        return xml_content

    def _xml_response(self, xml_content: ET.Element) -> bytes:
        if logger.isEnabledFor(logging.DEBUG):
            logger.debug("Response content:\n%s",
                         xmlutils.pretty_xml(xml_content))
        f = io.BytesIO()
        ET.ElementTree(xml_content).write(f, encoding=self._encoding,
                                          xml_declaration=True)
        return f.getvalue()

    def _webdav_error_response(self, status: int, human_tag: str
                               ) -> types.WSGIResponse:
        """Generate XML error response."""
        headers = {"Content-Type": "text/xml; charset=%s" % self._encoding}
        content = self._xml_response(xmlutils.webdav_error(human_tag))
        return status, headers, content


class Access:
    """Helper class to check access rights of an item"""

    user: str
    path: str
    parent_path: str
    permissions: str
    _rights: rights.BaseRights
    _parent_permissions: Optional[str]

    def __init__(self, rights: rights.BaseRights, user: str, path: str
                 ) -> None:
        self._rights = rights
        self.user = user
        self.path = path
        self.parent_path = pathutils.unstrip_path(
            posixpath.dirname(pathutils.strip_path(path)), True)
        self.permissions = self._rights.authorization(self.user, self.path)
        self._parent_permissions = None

    @property
    def parent_permissions(self) -> str:
        if self.path == self.parent_path:
            return self.permissions
        if self._parent_permissions is None:
            self._parent_permissions = self._rights.authorization(
                self.user, self.parent_path)
        return self._parent_permissions

    def check(self, permission: str,
              item: Optional[types.CollectionOrItem] = None) -> bool:
        if permission not in "rw":
            raise ValueError("Invalid permission argument: %r" % permission)
        if not item:
            permissions = permission + permission.upper()
            parent_permissions = permission
        elif isinstance(item, storage.BaseCollection):
            if item.tag:
                permissions = permission
            else:
                permissions = permission.upper()
            parent_permissions = ""
        else:
            permissions = ""
            parent_permissions = permission
        return bool(rights.intersect(self.permissions, permissions) or (
            self.path != self.parent_path and
            rights.intersect(self.parent_permissions, parent_permissions)))
|
|
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2017 Guillaume Ayoub
@@ -21,17 +21,17 @@ import posixpath
 from http import client
 from urllib.parse import quote

-from radicale import app, httputils, pathutils, storage, xmlutils
+from radicale import httputils, pathutils, storage, types, xmlutils
+from radicale.app.base import Access, ApplicationBase
 from radicale.log import logger


-def propose_filename(collection):
+def propose_filename(collection: storage.BaseCollection) -> str:
     """Propose a filename for a collection."""
-    tag = collection.get_meta("tag")
-    if tag == "VADDRESSBOOK":
+    if collection.tag == "VADDRESSBOOK":
         fallback_title = "Address book"
         suffix = ".vcf"
-    elif tag == "VCALENDAR":
+    elif collection.tag == "VCALENDAR":
         fallback_title = "Calendar"
         suffix = ".ics"
     else:
@@ -43,8 +43,9 @@ def propose_filename(collection):
     return title


-class ApplicationGetMixin:
-    def _content_disposition_attachement(self, filename):
+class ApplicationPartGet(ApplicationBase):
+
+    def _content_disposition_attachement(self, filename: str) -> str:
         value = "attachement"
         try:
             encoded_filename = quote(filename, encoding=self._encoding)
@@ -56,25 +57,27 @@ class ApplicationGetMixin:
         value += "; filename*=%s''%s" % (self._encoding, encoded_filename)
         return value

-    def do_GET(self, environ, base_prefix, path, user):
+    def do_GET(self, environ: types.WSGIEnviron, base_prefix: str, path: str,
+               user: str) -> types.WSGIResponse:
         """Manage GET request."""
-        # Redirect to .web if the root URL is requested
+        # Redirect to /.web if the root path is requested
         if not pathutils.strip_path(path):
-            web_path = ".web"
-            if not environ.get("PATH_INFO"):
-                web_path = posixpath.join(posixpath.basename(base_prefix),
-                                          web_path)
-            return (client.FOUND,
-                    {"Location": web_path, "Content-Type": "text/plain"},
-                    "Redirected to %s" % web_path)
-        # Dispatch .web URL to web module
+            return httputils.redirect(base_prefix + "/.web")
         if path == "/.web" or path.startswith("/.web/"):
+            # Redirect to sanitized path for all subpaths of /.web
+            unsafe_path = environ.get("PATH_INFO", "")
+            if unsafe_path != path:
+                location = base_prefix + path
+                logger.info("Redirecting to sanitized path: %r ==> %r",
+                            base_prefix + unsafe_path, location)
+                return httputils.redirect(location, client.MOVED_PERMANENTLY)
+            # Dispatch /.web path to web module
             return self._web.get(environ, base_prefix, path, user)
-        access = app.Access(self._rights, user, path)
+        access = Access(self._rights, user, path)
         if not access.check("r") and "i" not in access.permissions:
             return httputils.NOT_ALLOWED
         with self._storage.acquire_lock("r", user):
-            item = next(self._storage.discover(path), None)
+            item = next(iter(self._storage.discover(path)), None)
             if not item:
                 return httputils.NOT_FOUND
             if access.check("r", item):
@@ -84,11 +87,10 @@ class ApplicationGetMixin:
             else:
                 return httputils.NOT_ALLOWED
             if isinstance(item, storage.BaseCollection):
-                tag = item.get_meta("tag")
-                if not tag:
+                if not item.tag:
                     return (httputils.NOT_ALLOWED if limited_access else
                             httputils.DIRECTORY_LISTING)
-                content_type = xmlutils.MIMETYPES[tag]
+                content_type = xmlutils.MIMETYPES[item.tag]
                 content_disposition = self._content_disposition_attachement(
                     propose_filename(item))
             elif limited_access:
@@ -96,6 +98,7 @@ class ApplicationGetMixin:
             else:
                 content_type = xmlutils.OBJECT_MIMETYPES[item.name]
                 content_disposition = ""
+            assert item.last_modified
             headers = {
                 "Content-Type": content_type,
                 "Last-Modified": item.last_modified,
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2017 Guillaume Ayoub
@@ -17,9 +17,15 @@
 # You should have received a copy of the GNU General Public License
 # along with Radicale.  If not, see <http://www.gnu.org/licenses/>.

+from radicale import types
+from radicale.app.base import ApplicationBase
+from radicale.app.get import ApplicationPartGet

-class ApplicationHeadMixin:
-    def do_HEAD(self, environ, base_prefix, path, user):
+
+class ApplicationPartHead(ApplicationPartGet, ApplicationBase):
+
+    def do_HEAD(self, environ: types.WSGIEnviron, base_prefix: str, path: str,
+                user: str) -> types.WSGIResponse:
         """Manage HEAD request."""
-        status, headers, _ = self.do_GET(environ, base_prefix, path, user)
-        return status, headers, None
+        # Body is dropped in `Application.__call__` for HEAD requests
+        return self.do_GET(environ, base_prefix, path, user)
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2017 Guillaume Ayoub
@@ -21,48 +21,51 @@ import posixpath
 import socket
 from http import client

-from radicale import httputils
-from radicale import item as radicale_item
-from radicale import pathutils, storage, xmlutils
+import radicale.item as radicale_item
+from radicale import httputils, pathutils, storage, types, xmlutils
+from radicale.app.base import ApplicationBase
 from radicale.log import logger


-class ApplicationMkcalendarMixin:
-    def do_MKCALENDAR(self, environ, base_prefix, path, user):
+class ApplicationPartMkcalendar(ApplicationBase):
+
+    def do_MKCALENDAR(self, environ: types.WSGIEnviron, base_prefix: str,
+                      path: str, user: str) -> types.WSGIResponse:
         """Manage MKCALENDAR request."""
         if "w" not in self._rights.authorization(user, path):
             return httputils.NOT_ALLOWED
         try:
-            xml_content = self._read_xml_content(environ)
+            xml_content = self._read_xml_request_body(environ)
         except RuntimeError as e:
             logger.warning(
                 "Bad MKCALENDAR request on %r: %s", path, e, exc_info=True)
             return httputils.BAD_REQUEST
         except socket.timeout:
-            logger.debug("client timed out", exc_info=True)
+            logger.debug("Client timed out", exc_info=True)
             return httputils.REQUEST_TIMEOUT
         # Prepare before locking
-        props = xmlutils.props_from_request(xml_content)
-        props["tag"] = "VCALENDAR"
-        # TODO: use this?
-        # timezone = props.get("C:calendar-timezone")
+        props_with_remove = xmlutils.props_from_request(xml_content)
+        props_with_remove["tag"] = "VCALENDAR"
         try:
-            radicale_item.check_and_sanitize_props(props)
+            props = radicale_item.check_and_sanitize_props(props_with_remove)
         except ValueError as e:
             logger.warning(
                 "Bad MKCALENDAR request on %r: %s", path, e, exc_info=True)
             return httputils.BAD_REQUEST
+        # TODO: use this?
+        # timezone = props.get("C:calendar-timezone")
         with self._storage.acquire_lock("w", user):
-            item = next(self._storage.discover(path), None)
+            item = next(iter(self._storage.discover(path)), None)
             if item:
                 return self._webdav_error_response(
                     client.CONFLICT, "D:resource-must-be-null")
             parent_path = pathutils.unstrip_path(
                 posixpath.dirname(pathutils.strip_path(path)), True)
-            parent_item = next(self._storage.discover(parent_path), None)
+            parent_item = next(iter(self._storage.discover(parent_path)), None)
             if not parent_item:
                 return httputils.CONFLICT
             if (not isinstance(parent_item, storage.BaseCollection) or
-                    parent_item.get_meta("tag")):
+                    parent_item.tag):
                 return httputils.FORBIDDEN
             try:
                 self._storage.create_collection(path, props=props)
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2017 Guillaume Ayoub
@@ -21,31 +21,33 @@ import posixpath
 import socket
 from http import client

-from radicale import httputils
-from radicale import item as radicale_item
-from radicale import pathutils, rights, storage, xmlutils
+import radicale.item as radicale_item
+from radicale import httputils, pathutils, rights, storage, types, xmlutils
+from radicale.app.base import ApplicationBase
 from radicale.log import logger


-class ApplicationMkcolMixin:
-    def do_MKCOL(self, environ, base_prefix, path, user):
+class ApplicationPartMkcol(ApplicationBase):
+
+    def do_MKCOL(self, environ: types.WSGIEnviron, base_prefix: str,
+                 path: str, user: str) -> types.WSGIResponse:
         """Manage MKCOL request."""
         permissions = self._rights.authorization(user, path)
         if not rights.intersect(permissions, "Ww"):
             return httputils.NOT_ALLOWED
         try:
-            xml_content = self._read_xml_content(environ)
+            xml_content = self._read_xml_request_body(environ)
         except RuntimeError as e:
             logger.warning(
                 "Bad MKCOL request on %r: %s", path, e, exc_info=True)
             return httputils.BAD_REQUEST
         except socket.timeout:
-            logger.debug("client timed out", exc_info=True)
+            logger.debug("Client timed out", exc_info=True)
             return httputils.REQUEST_TIMEOUT
         # Prepare before locking
-        props = xmlutils.props_from_request(xml_content)
+        props_with_remove = xmlutils.props_from_request(xml_content)
         try:
-            radicale_item.check_and_sanitize_props(props)
+            props = radicale_item.check_and_sanitize_props(props_with_remove)
         except ValueError as e:
             logger.warning(
                 "Bad MKCOL request on %r: %s", path, e, exc_info=True)
@@ -54,16 +56,16 @@ class ApplicationMkcolMixin:
                 not props.get("tag") and "W" not in permissions):
             return httputils.NOT_ALLOWED
         with self._storage.acquire_lock("w", user):
-            item = next(self._storage.discover(path), None)
+            item = next(iter(self._storage.discover(path)), None)
             if item:
                 return httputils.METHOD_NOT_ALLOWED
             parent_path = pathutils.unstrip_path(
                 posixpath.dirname(pathutils.strip_path(path)), True)
-            parent_item = next(self._storage.discover(parent_path), None)
+            parent_item = next(iter(self._storage.discover(parent_path)), None)
             if not parent_item:
                 return httputils.CONFLICT
             if (not isinstance(parent_item, storage.BaseCollection) or
-                    parent_item.get_meta("tag")):
+                    parent_item.tag):
                 return httputils.FORBIDDEN
             try:
                 self._storage.create_collection(path, props=props)
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2017 Guillaume Ayoub
@@ -21,12 +21,15 @@ import posixpath
 from http import client
 from urllib.parse import urlparse

-from radicale import app, httputils, pathutils, storage
+from radicale import httputils, pathutils, storage, types
+from radicale.app.base import Access, ApplicationBase
 from radicale.log import logger


-class ApplicationMoveMixin:
-    def do_MOVE(self, environ, base_prefix, path, user):
+class ApplicationPartMove(ApplicationBase):
+
+    def do_MOVE(self, environ: types.WSGIEnviron, base_prefix: str,
+                path: str, user: str) -> types.WSGIResponse:
         """Manage MOVE request."""
         raw_dest = environ.get("HTTP_DESTINATION", "")
         to_url = urlparse(raw_dest)
@@ -34,7 +37,7 @@ class ApplicationMoveMixin:
             logger.info("Unsupported destination address: %r", raw_dest)
             # Remote destination server, not supported
             return httputils.REMOTE_DESTINATION
-        access = app.Access(self._rights, user, path)
+        access = Access(self._rights, user, path)
         if not access.check("w"):
             return httputils.NOT_ALLOWED
         to_path = pathutils.sanitize_path(to_url.path)
@@ -43,12 +46,12 @@ class ApplicationMoveMixin:
                 "start with base prefix", to_path, path)
             return httputils.NOT_ALLOWED
         to_path = to_path[len(base_prefix):]
-        to_access = app.Access(self._rights, user, to_path)
+        to_access = Access(self._rights, user, to_path)
         if not to_access.check("w"):
             return httputils.NOT_ALLOWED

         with self._storage.acquire_lock("w", user):
-            item = next(self._storage.discover(path), None)
+            item = next(iter(self._storage.discover(path)), None)
             if not item:
                 return httputils.NOT_FOUND
             if (not access.check("w", item) or
@@ -58,17 +61,19 @@ class ApplicationMoveMixin:
                 # TODO: support moving collections
                 return httputils.METHOD_NOT_ALLOWED

-            to_item = next(self._storage.discover(to_path), None)
+            to_item = next(iter(self._storage.discover(to_path)), None)
             if isinstance(to_item, storage.BaseCollection):
                 return httputils.FORBIDDEN
             to_parent_path = pathutils.unstrip_path(
                 posixpath.dirname(pathutils.strip_path(to_path)), True)
-            to_collection = next(
-                self._storage.discover(to_parent_path), None)
+            to_collection = next(iter(
+                self._storage.discover(to_parent_path)), None)
             if not to_collection:
                 return httputils.CONFLICT
-            tag = item.collection.get_meta("tag")
-            if not tag or tag != to_collection.get_meta("tag"):
+            assert isinstance(to_collection, storage.BaseCollection)
+            assert item.collection is not None
+            collection_tag = item.collection.tag
+            if not collection_tag or collection_tag != to_collection.tag:
                 return httputils.FORBIDDEN
             if to_item and environ.get("HTTP_OVERWRITE", "F") != "T":
                 return httputils.PRECONDITION_FAILED
@@ -78,7 +83,7 @@ class ApplicationMoveMixin:
                     to_collection.has_uid(item.uid)):
                 return self._webdav_error_response(
                     client.CONFLICT, "%s:no-uid-conflict" % (
-                        "C" if tag == "VCALENDAR" else "CR"))
+                        "C" if collection_tag == "VCALENDAR" else "CR"))
             to_href = posixpath.basename(pathutils.strip_path(to_path))
             try:
                 self._storage.move(item, to_collection, to_href)
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2017 Guillaume Ayoub
@@ -19,11 +19,14 @@

 from http import client

-from radicale import httputils
+from radicale import httputils, types
+from radicale.app.base import ApplicationBase


-class ApplicationOptionsMixin:
-    def do_OPTIONS(self, environ, base_prefix, path, user):
+class ApplicationPartOptions(ApplicationBase):
+
+    def do_OPTIONS(self, environ: types.WSGIEnviron, base_prefix: str,
+                   path: str, user: str) -> types.WSGIResponse:
         """Manage OPTIONS request."""
         headers = {
             "Allow": ", ".join(
radicale/app/post.py (new file, 32 lines)
@@ -0,0 +1,32 @@
+# This file is part of Radicale - CalDAV and CardDAV server
+# Copyright © 2008 Nicolas Kandel
+# Copyright © 2008 Pascal Halter
+# Copyright © 2008-2017 Guillaume Ayoub
+# Copyright © 2017-2018 Unrud <unrud@outlook.com>
+# Copyright © 2020 Tom Hacohen <tom@stosb.com>
+#
+# This library is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This library is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with Radicale.  If not, see <http://www.gnu.org/licenses/>.
+
+from radicale import httputils, types
+from radicale.app.base import ApplicationBase
+
+
+class ApplicationPartPost(ApplicationBase):
+
+    def do_POST(self, environ: types.WSGIEnviron, base_prefix: str,
+                path: str, user: str) -> types.WSGIResponse:
+        """Manage POST request."""
+        if path == "/.web" or path.startswith("/.web/"):
+            return self._web.post(environ, base_prefix, path, user)
+        return httputils.METHOD_NOT_ALLOWED
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2017 Guillaume Ayoub
@@ -21,15 +21,19 @@ import collections
 import itertools
 import posixpath
 import socket
-import xml.etree.ElementTree as ET
 from http import client
+from xml.etree import ElementTree as ET
+from typing import Dict, Iterable, Iterator, List, Optional, Sequence, Tuple

-from radicale import app, httputils, pathutils, rights, storage, xmlutils
+from radicale import httputils, pathutils, rights, storage, types, xmlutils
+from radicale.app.base import Access, ApplicationBase
 from radicale.log import logger


-def xml_propfind(base_prefix, path, xml_request, allowed_items, user,
-                 encoding):
+def xml_propfind(base_prefix: str, path: str,
+                 xml_request: Optional[ET.Element],
+                 allowed_items: Iterable[Tuple[types.CollectionOrItem, str]],
+                 user: str, encoding: str) -> Optional[ET.Element]:
     """Read and answer PROPFIND requests.

     Read rfc4918-9.1 for info.
@@ -40,24 +44,24 @@ def xml_propfind(base_prefix, path, xml_request, allowed_items, user,
     """
     # A client may choose not to submit a request body. An empty PROPFIND
     # request body MUST be treated as if it were an 'allprop' request.
-    top_tag = (xml_request[0] if xml_request is not None else
-               ET.Element(xmlutils.make_clark("D:allprop")))
+    top_element = (xml_request[0] if xml_request is not None else
+                   ET.Element(xmlutils.make_clark("D:allprop")))

-    props = ()
+    props: List[str] = []
     allprop = False
     propname = False
-    if top_tag.tag == xmlutils.make_clark("D:allprop"):
+    if top_element.tag == xmlutils.make_clark("D:allprop"):
         allprop = True
-    elif top_tag.tag == xmlutils.make_clark("D:propname"):
+    elif top_element.tag == xmlutils.make_clark("D:propname"):
         propname = True
-    elif top_tag.tag == xmlutils.make_clark("D:prop"):
-        props = [prop.tag for prop in top_tag]
+    elif top_element.tag == xmlutils.make_clark("D:prop"):
+        props.extend(prop.tag for prop in top_element)

     if xmlutils.make_clark("D:current-user-principal") in props and not user:
         # Ask for authentication
         # Returning the DAV:unauthenticated pseudo-principal as specified in
         # RFC 5397 doesn't seem to work with DAVx5.
-        return client.FORBIDDEN, None
+        return None

     # Writing answer
     multistatus = ET.Element(xmlutils.make_clark("D:multistatus"))
@@ -68,29 +72,32 @@ def xml_propfind(base_prefix, path, xml_request, allowed_items, user,
             base_prefix, path, item, props, user, encoding, write=write,
             allprop=allprop, propname=propname))

-    return client.MULTI_STATUS, multistatus
+    return multistatus


-def xml_propfind_response(base_prefix, path, item, props, user, encoding,
-                          write=False, propname=False, allprop=False):
+def xml_propfind_response(
+        base_prefix: str, path: str, item: types.CollectionOrItem,
+        props: Sequence[str], user: str, encoding: str, write: bool = False,
+        propname: bool = False, allprop: bool = False) -> ET.Element:
     """Build and return a PROPFIND response."""
     if propname and allprop or (props and (propname or allprop)):
         raise ValueError("Only use one of props, propname and allprops")
-    is_collection = isinstance(item, storage.BaseCollection)
-    if is_collection:
-        is_leaf = item.get_meta("tag") in ("VADDRESSBOOK", "VCALENDAR")
-        collection = item
-    else:
-        collection = item.collection
-
-    response = ET.Element(xmlutils.make_clark("D:response"))
-    href = ET.Element(xmlutils.make_clark("D:href"))
-    if is_collection:
-        # Some clients expect collections to end with /
+    if isinstance(item, storage.BaseCollection):
+        is_collection = True
+        is_leaf = item.tag in ("VADDRESSBOOK", "VCALENDAR")
+        collection = item
+        # Some clients expect collections to end with `/`
         uri = pathutils.unstrip_path(item.path, True)
     else:
-        uri = pathutils.unstrip_path(
-            posixpath.join(collection.path, item.href))
+        is_collection = is_leaf = False
+        assert item.collection is not None
+        assert item.href
+        collection = item.collection
+        uri = pathutils.unstrip_path(posixpath.join(
+            collection.path, item.href))
+    response = ET.Element(xmlutils.make_clark("D:response"))
+    href = ET.Element(xmlutils.make_clark("D:href"))
     href.text = xmlutils.make_href(base_prefix, uri)
     response.append(href)

@@ -120,12 +127,12 @@ def xml_propfind_response(base_prefix, path, item, props, user, encoding,
         if is_leaf:
             props.append(xmlutils.make_clark("D:displayname"))
             props.append(xmlutils.make_clark("D:sync-token"))
-            if collection.get_meta("tag") == "VCALENDAR":
+            if collection.tag == "VCALENDAR":
                 props.append(xmlutils.make_clark("CS:getctag"))
                 props.append(
                     xmlutils.make_clark("C:supported-calendar-component-set"))

-        meta = item.get_meta()
+        meta = collection.get_meta()
         for tag in meta:
             if tag == "tag":
                 continue
@@ -133,11 +140,11 @@ def xml_propfind_response(base_prefix, path, item, props, user, encoding,
             if clark_tag not in props:
                 props.append(clark_tag)

-    responses = collections.defaultdict(list)
+    responses: Dict[int, List[ET.Element]] = collections.defaultdict(list)
     if propname:
         for tag in props:
             responses[200].append(ET.Element(tag))
-        props = ()
+        props = []
     for tag in props:
         element = ET.Element(tag)
         is404 = False
@@ -152,25 +159,25 @@ def xml_propfind_response(base_prefix, path, item, props, user, encoding,
             else:
                 is404 = True
         elif tag == xmlutils.make_clark("D:principal-collection-set"):
-            tag = ET.Element(xmlutils.make_clark("D:href"))
-            tag.text = xmlutils.make_href(base_prefix, "/")
-            element.append(tag)
+            child_element = ET.Element(xmlutils.make_clark("D:href"))
+            child_element.text = xmlutils.make_href(base_prefix, "/")
+            element.append(child_element)
         elif (tag in (xmlutils.make_clark("C:calendar-user-address-set"),
                       xmlutils.make_clark("D:principal-URL"),
                       xmlutils.make_clark("CR:addressbook-home-set"),
                       xmlutils.make_clark("C:calendar-home-set")) and
-                collection.is_principal and is_collection):
-            tag = ET.Element(xmlutils.make_clark("D:href"))
-            tag.text = xmlutils.make_href(base_prefix, path)
-            element.append(tag)
+                is_collection and collection.is_principal):
+            child_element = ET.Element(xmlutils.make_clark("D:href"))
+            child_element.text = xmlutils.make_href(base_prefix, path)
+            element.append(child_element)
         elif tag == xmlutils.make_clark("C:supported-calendar-component-set"):
             human_tag = xmlutils.make_human_tag(tag)
             if is_collection and is_leaf:
-                meta = item.get_meta(human_tag)
-                if meta:
-                    components = meta.split(",")
+                components_text = collection.get_meta(human_tag)
+                if components_text:
+                    components = components_text.split(",")
                 else:
-                    components = ("VTODO", "VEVENT", "VJOURNAL")
+                    components = ["VTODO", "VEVENT", "VJOURNAL"]
                 for component in components:
                     comp = ET.Element(xmlutils.make_clark("C:comp"))
                     comp.set("name", component)
@@ -179,9 +186,10 @@ def xml_propfind_response(base_prefix, path, item, props, user, encoding,
                 is404 = True
         elif tag == xmlutils.make_clark("D:current-user-principal"):
             if user:
-                tag = ET.Element(xmlutils.make_clark("D:href"))
-                tag.text = xmlutils.make_href(base_prefix, "/%s/" % user)
-                element.append(tag)
+                child_element = ET.Element(xmlutils.make_clark("D:href"))
+                child_element.text = xmlutils.make_href(
+                    base_prefix, "/%s/" % user)
+                element.append(child_element)
             else:
                 element.append(ET.Element(
                     xmlutils.make_clark("D:unauthenticated")))
@@ -204,18 +212,19 @@ def xml_propfind_response(base_prefix, path, item, props, user, encoding,
                        "D:principal-property-search"]
             if is_collection and is_leaf:
                 reports.append("D:sync-collection")
-                if item.get_meta("tag") == "VADDRESSBOOK":
+                if collection.tag == "VADDRESSBOOK":
                     reports.append("CR:addressbook-multiget")
                     reports.append("CR:addressbook-query")
-                elif item.get_meta("tag") == "VCALENDAR":
+                elif collection.tag == "VCALENDAR":
                     reports.append("C:calendar-multiget")
                     reports.append("C:calendar-query")
             for human_tag in reports:
                 supported_report = ET.Element(
                     xmlutils.make_clark("D:supported-report"))
-                report_tag = ET.Element(xmlutils.make_clark("D:report"))
-                report_tag.append(ET.Element(xmlutils.make_clark(human_tag)))
-                supported_report.append(report_tag)
+                report_element = ET.Element(xmlutils.make_clark("D:report"))
+                report_element.append(
+                    ET.Element(xmlutils.make_clark(human_tag)))
+                supported_report.append(report_element)
                 element.append(supported_report)
         elif tag == xmlutils.make_clark("D:getcontentlength"):
             if not is_collection or is_leaf:
@@ -225,64 +234,68 @@ def xml_propfind_response(base_prefix, path, item, props, user, encoding,
         elif tag == xmlutils.make_clark("D:owner"):
             # return empty elment, if no owner available (rfc3744-5.1)
             if collection.owner:
-                tag = ET.Element(xmlutils.make_clark("D:href"))
-                tag.text = xmlutils.make_href(
+                child_element = ET.Element(xmlutils.make_clark("D:href"))
+                child_element.text = xmlutils.make_href(
                     base_prefix, "/%s/" % collection.owner)
-                element.append(tag)
+                element.append(child_element)
         elif is_collection:
             if tag == xmlutils.make_clark("D:getcontenttype"):
                 if is_leaf:
-                    element.text = xmlutils.MIMETYPES[item.get_meta("tag")]
+                    element.text = xmlutils.MIMETYPES[
+                        collection.tag]
                 else:
                     is404 = True
             elif tag == xmlutils.make_clark("D:resourcetype"):
-                if item.is_principal:
-                    tag = ET.Element(xmlutils.make_clark("D:principal"))
-                    element.append(tag)
+                if collection.is_principal:
+                    child_element = ET.Element(
+                        xmlutils.make_clark("D:principal"))
+                    element.append(child_element)
                 if is_leaf:
-                    if item.get_meta("tag") == "VADDRESSBOOK":
-                        tag = ET.Element(
+                    if collection.tag == "VADDRESSBOOK":
+                        child_element = ET.Element(
                             xmlutils.make_clark("CR:addressbook"))
-                        element.append(tag)
-                    elif item.get_meta("tag") == "VCALENDAR":
-                        tag = ET.Element(xmlutils.make_clark("C:calendar"))
-                        element.append(tag)
-                tag = ET.Element(xmlutils.make_clark("D:collection"))
-                element.append(tag)
+                        element.append(child_element)
+                    elif collection.tag == "VCALENDAR":
+                        child_element = ET.Element(
+                            xmlutils.make_clark("C:calendar"))
+                        element.append(child_element)
+                child_element = ET.Element(xmlutils.make_clark("D:collection"))
+                element.append(child_element)
             elif tag == xmlutils.make_clark("RADICALE:displayname"):
                 # Only for internal use by the web interface
-                displayname = item.get_meta("D:displayname")
+                displayname = collection.get_meta("D:displayname")
                 if displayname is not None:
                     element.text = displayname
                 else:
                     is404 = True
             elif tag == xmlutils.make_clark("D:displayname"):
-                displayname = item.get_meta("D:displayname")
+                displayname = collection.get_meta("D:displayname")
                 if not displayname and is_leaf:
-                    displayname = item.path
+                    displayname = collection.path
                 if displayname is not None:
                     element.text = displayname
                 else:
                     is404 = True
             elif tag == xmlutils.make_clark("CS:getctag"):
                 if is_leaf:
-                    element.text = item.etag
+                    element.text = collection.etag
                 else:
                     is404 = True
             elif tag == xmlutils.make_clark("D:sync-token"):
                 if is_leaf:
-                    element.text, _ = item.sync()
+                    element.text, _ = collection.sync()
                 else:
                     is404 = True
             else:
                 human_tag = xmlutils.make_human_tag(tag)
-                meta = item.get_meta(human_tag)
-                if meta is not None:
-                    element.text = meta
+                tag_text = collection.get_meta(human_tag)
+                if tag_text is not None:
+                    element.text = tag_text
                 else:
                     is404 = True
         # Not for collections
         elif tag == xmlutils.make_clark("D:getcontenttype"):
+            assert not isinstance(item, storage.BaseCollection)
             element.text = xmlutils.get_content_type(item, encoding)
         elif tag == xmlutils.make_clark("D:resourcetype"):
             # resourcetype must be returned empty for non-collection elements
@@ -307,13 +320,16 @@ def xml_propfind_response(base_prefix, path, item, props, user, encoding,
     return response


-class ApplicationPropfindMixin:
-    def _collect_allowed_items(self, items, user):
+class ApplicationPartPropfind(ApplicationBase):
+
+    def _collect_allowed_items(
+            self, items: Iterable[types.CollectionOrItem], user: str
+            ) -> Iterator[Tuple[types.CollectionOrItem, str]]:
         """Get items from request that user is allowed to access."""
         for item in items:
             if isinstance(item, storage.BaseCollection):
                 path = pathutils.unstrip_path(item.path, True)
-                if item.get_meta("tag"):
+                if item.tag:
                     permissions = rights.intersect(
                         self._rights.authorization(user, path), "rw")
|
||||
target = "collection with tag %r" % item.path
|
||||
|
@ -322,6 +338,7 @@ class ApplicationPropfindMixin:
|
|||
self._rights.authorization(user, path), "RW")
|
||||
target = "collection %r" % item.path
|
||||
else:
|
||||
assert item.collection is not None
|
||||
path = pathutils.unstrip_path(item.collection.path, True)
|
||||
permissions = rights.intersect(
|
||||
self._rights.authorization(user, path), "rw")
|
||||
|
@ -341,37 +358,37 @@ class ApplicationPropfindMixin:
|
|||
if permission:
|
||||
yield item, permission
|
||||
|
||||
def do_PROPFIND(self, environ, base_prefix, path, user):
|
||||
def do_PROPFIND(self, environ: types.WSGIEnviron, base_prefix: str,
|
||||
path: str, user: str) -> types.WSGIResponse:
|
||||
"""Manage PROPFIND request."""
|
||||
access = app.Access(self._rights, user, path)
|
||||
access = Access(self._rights, user, path)
|
||||
if not access.check("r"):
|
||||
return httputils.NOT_ALLOWED
|
||||
try:
|
||||
xml_content = self._read_xml_content(environ)
|
||||
xml_content = self._read_xml_request_body(environ)
|
||||
except RuntimeError as e:
|
||||
logger.warning(
|
||||
"Bad PROPFIND request on %r: %s", path, e, exc_info=True)
|
||||
return httputils.BAD_REQUEST
|
||||
except socket.timeout:
|
||||
logger.debug("client timed out", exc_info=True)
|
||||
logger.debug("Client timed out", exc_info=True)
|
||||
return httputils.REQUEST_TIMEOUT
|
||||
with self._storage.acquire_lock("r", user):
|
||||
items = self._storage.discover(
|
||||
path, environ.get("HTTP_DEPTH", "0"))
|
||||
items_iter = iter(self._storage.discover(
|
||||
path, environ.get("HTTP_DEPTH", "0")))
|
||||
# take root item for rights checking
|
||||
item = next(items, None)
|
||||
item = next(items_iter, None)
|
||||
if not item:
|
||||
return httputils.NOT_FOUND
|
||||
if not access.check("r", item):
|
||||
return httputils.NOT_ALLOWED
|
||||
# put item back
|
||||
items = itertools.chain([item], items)
|
||||
allowed_items = self._collect_allowed_items(items, user)
|
||||
items_iter = itertools.chain([item], items_iter)
|
||||
allowed_items = self._collect_allowed_items(items_iter, user)
|
||||
headers = {"DAV": httputils.DAV_HEADERS,
|
||||
"Content-Type": "text/xml; charset=%s" % self._encoding}
|
||||
status, xml_answer = xml_propfind(
|
||||
base_prefix, path, xml_content, allowed_items, user,
|
||||
self._encoding)
|
||||
if status == client.FORBIDDEN and xml_answer is None:
|
||||
xml_answer = xml_propfind(base_prefix, path, xml_content,
|
||||
allowed_items, user, self._encoding)
|
||||
if xml_answer is None:
|
||||
return httputils.NOT_ALLOWED
|
||||
return status, headers, self._write_xml_content(xml_answer)
|
||||
return client.MULTI_STATUS, headers, self._xml_response(xml_answer)
|
||||
|
|
|
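The reworked `do_PROPFIND` above peeks at the first item from `discover()` for the rights check and then rebuilds the full sequence with `itertools.chain` before handing it on. A minimal standalone sketch of that peek-and-chain pattern (the `peek` helper is illustrative, not part of Radicale's API):

```python
import itertools
from typing import Iterator, Optional, Tuple


def peek(iterable) -> Tuple[Optional[object], Iterator]:
    # Take the first element for inspection, then return an iterator
    # that still yields the complete original sequence.
    it = iter(iterable)
    first = next(it, None)
    if first is None:
        return None, iter(())
    return first, itertools.chain([first], it)


head, items = peek(["root", "child1", "child2"])
assert head == "root"                               # inspected up front
assert list(items) == ["root", "child1", "child2"]  # nothing lost
```

This avoids materializing the whole (potentially large) discovery result just to look at its root element.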
@@ -1,4 +1,4 @@
# This file is part of Radicale Server - Calendar Server
# This file is part of Radicale - CalDAV and CardDAV server
# Copyright © 2008 Nicolas Kandel
# Copyright © 2008 Pascal Halter
# Copyright © 2008-2017 Guillaume Ayoub

@@ -20,17 +20,22 @@
import contextlib
import posixpath
import socket
import xml.etree.ElementTree as ET
from http import client
from typing import Callable, Iterable, Iterator, Optional, Sequence, Tuple
from urllib.parse import unquote, urlparse
from xml.etree import ElementTree as ET

from radicale import app, httputils, pathutils, storage, xmlutils
import radicale.item as radicale_item
from radicale import httputils, pathutils, storage, types, xmlutils
from radicale.app.base import Access, ApplicationBase
from radicale.item import filter as radicale_filter
from radicale.log import logger


def xml_report(base_prefix, path, xml_request, collection, encoding,
unlock_storage_fn):
def xml_report(base_prefix: str, path: str, xml_request: Optional[ET.Element],
collection: storage.BaseCollection, encoding: str,
unlock_storage_fn: Callable[[], None]
) -> Tuple[int, ET.Element]:
"""Read and answer REPORT requests.

Read rfc3253-3.6 for info.

@@ -40,10 +45,9 @@ def xml_report(base_prefix, path, xml_request, collection, encoding,
if xml_request is None:
return client.MULTI_STATUS, multistatus
root = xml_request
if root.tag in (
xmlutils.make_clark("D:principal-search-property-set"),
xmlutils.make_clark("D:principal-property-search"),
xmlutils.make_clark("D:expand-property")):
if root.tag in (xmlutils.make_clark("D:principal-search-property-set"),
xmlutils.make_clark("D:principal-property-search"),
xmlutils.make_clark("D:expand-property")):
# We don't support searching for principals or indirect retrieving of
# properties, just return an empty result.
# InfCloud asks for expand-property reports (even if we don't announce

@@ -52,28 +56,28 @@ def xml_report(base_prefix, path, xml_request, collection, encoding,
xmlutils.make_human_tag(root.tag), path)
return client.MULTI_STATUS, multistatus
if (root.tag == xmlutils.make_clark("C:calendar-multiget") and
collection.get_meta("tag") != "VCALENDAR" or
collection.tag != "VCALENDAR" or
root.tag == xmlutils.make_clark("CR:addressbook-multiget") and
collection.get_meta("tag") != "VADDRESSBOOK" or
collection.tag != "VADDRESSBOOK" or
root.tag == xmlutils.make_clark("D:sync-collection") and
collection.get_meta("tag") not in ("VADDRESSBOOK", "VCALENDAR")):
collection.tag not in ("VADDRESSBOOK", "VCALENDAR")):
logger.warning("Invalid REPORT method %r on %r requested",
xmlutils.make_human_tag(root.tag), path)
return (client.FORBIDDEN,
xmlutils.webdav_error("D:supported-report"))
return client.FORBIDDEN, xmlutils.webdav_error("D:supported-report")
prop_element = root.find(xmlutils.make_clark("D:prop"))
props = (
[prop.tag for prop in prop_element]
if prop_element is not None else [])
props = ([prop.tag for prop in prop_element]
if prop_element is not None else [])

hreferences: Iterable[str]
if root.tag in (
xmlutils.make_clark("C:calendar-multiget"),
xmlutils.make_clark("CR:addressbook-multiget")):
# Read rfc4791-7.9 for info
hreferences = set()
for href_element in root.findall(xmlutils.make_clark("D:href")):
href_path = pathutils.sanitize_path(
unquote(urlparse(href_element.text).path))
temp_url_path = urlparse(href_element.text).path
assert isinstance(temp_url_path, str)
href_path = pathutils.sanitize_path(unquote(temp_url_path))
if (href_path + "/").startswith(base_prefix + "/"):
hreferences.add(href_path[len(base_prefix):])
else:

@@ -104,85 +108,16 @@ def xml_report(base_prefix, path, xml_request, collection, encoding,
else:
hreferences = (path,)
filters = (
root.findall("./%s" % xmlutils.make_clark("C:filter")) +
root.findall("./%s" % xmlutils.make_clark("CR:filter")))

def retrieve_items(collection, hreferences, multistatus):
"""Retrieves all items that are referenced in ``hreferences`` from
``collection`` and adds 404 responses for missing and invalid items
to ``multistatus``."""
collection_requested = False

def get_names():
"""Extracts all names from references in ``hreferences`` and adds
404 responses for invalid references to ``multistatus``.
If the whole collections is referenced ``collection_requested``
gets set to ``True``."""
nonlocal collection_requested
for hreference in hreferences:
try:
name = pathutils.name_from_path(hreference, collection)
except ValueError as e:
logger.warning("Skipping invalid path %r in REPORT request"
" on %r: %s", hreference, path, e)
response = xml_item_response(base_prefix, hreference,
found_item=False)
multistatus.append(response)
continue
if name:
# Reference is an item
yield name
else:
# Reference is a collection
collection_requested = True

for name, item in collection.get_multi(get_names()):
if not item:
uri = pathutils.unstrip_path(
posixpath.join(collection.path, name))
response = xml_item_response(base_prefix, uri,
found_item=False)
multistatus.append(response)
else:
yield item, False
if collection_requested:
yield from collection.get_filtered(filters)
root.findall(xmlutils.make_clark("C:filter")) +
root.findall(xmlutils.make_clark("CR:filter")))

# Retrieve everything required for finishing the request.
retrieved_items = list(retrieve_items(collection, hreferences,
multistatus))
collection_tag = collection.get_meta("tag")
# Don't access storage after this!
retrieved_items = list(retrieve_items(
base_prefix, path, collection, hreferences, filters, multistatus))
collection_tag = collection.tag
# !!! Don't access storage after this !!!
unlock_storage_fn()

def match(item, filter_):
tag = collection_tag
if (tag == "VCALENDAR" and
filter_.tag != xmlutils.make_clark("C:%s" % filter_)):
if len(filter_) == 0:
return True
if len(filter_) > 1:
raise ValueError("Filter with %d children" % len(filter_))
if filter_[0].tag != xmlutils.make_clark("C:comp-filter"):
raise ValueError("Unexpected %r in filter" % filter_[0].tag)
return radicale_filter.comp_match(item, filter_[0])
if (tag == "VADDRESSBOOK" and
filter_.tag != xmlutils.make_clark("CR:%s" % filter_)):
for child in filter_:
if child.tag != xmlutils.make_clark("CR:prop-filter"):
raise ValueError("Unexpected %r in filter" % child.tag)
test = filter_.get("test", "anyof")
if test == "anyof":
return any(
radicale_filter.prop_match(item.vobject_item, f, "CR")
for f in filter_)
if test == "allof":
return all(
radicale_filter.prop_match(item.vobject_item, f, "CR")
for f in filter_)
raise ValueError("Unsupported filter test: %r" % test)
raise ValueError("unsupported filter %r for %r" % (filter_.tag, tag))

while retrieved_items:
# ``item.vobject_item`` might be accessed during filtering.
# Don't keep reference to ``item``, because VObject requires a lot of

@@ -190,7 +125,8 @@ def xml_report(base_prefix, path, xml_request, collection, encoding,
item, filters_matched = retrieved_items.pop(0)
if filters and not filters_matched:
try:
if not all(match(item, filter_) for filter_ in filters):
if not all(test_filter(collection_tag, item, filter_)
for filter_ in filters):
continue
except ValueError as e:
raise ValueError("Failed to filter item %r from %r: %s" %

@@ -218,6 +154,7 @@ def xml_report(base_prefix, path, xml_request, collection, encoding,
else:
not_found_props.append(element)

assert item.href
uri = pathutils.unstrip_path(
posixpath.join(collection.path, item.href))
multistatus.append(xml_item_response(

@@ -227,13 +164,15 @@ def xml_report(base_prefix, path, xml_request, collection, encoding,
return client.MULTI_STATUS, multistatus


def xml_item_response(base_prefix, href, found_props=(), not_found_props=(),
found_item=True):
def xml_item_response(base_prefix: str, href: str,
found_props: Sequence[ET.Element] = (),
not_found_props: Sequence[ET.Element] = (),
found_item: bool = True) -> ET.Element:
response = ET.Element(xmlutils.make_clark("D:response"))

href_tag = ET.Element(xmlutils.make_clark("D:href"))
href_tag.text = xmlutils.make_href(base_prefix, href)
response.append(href_tag)
href_element = ET.Element(xmlutils.make_clark("D:href"))
href_element.text = xmlutils.make_href(base_prefix, href)
response.append(href_element)

if found_item:
for code, props in ((200, found_props), (404, not_found_props)):

@@ -241,10 +180,10 @@ def xml_item_response(base_prefix, href, found_props=(), not_found_props=(),
propstat = ET.Element(xmlutils.make_clark("D:propstat"))
status = ET.Element(xmlutils.make_clark("D:status"))
status.text = xmlutils.make_response(code)
prop_tag = ET.Element(xmlutils.make_clark("D:prop"))
prop_element = ET.Element(xmlutils.make_clark("D:prop"))
for prop in props:
prop_tag.append(prop)
propstat.append(prop_tag)
prop_element.append(prop)
propstat.append(prop_element)
propstat.append(status)
response.append(propstat)
else:

@@ -255,24 +194,98 @@ def xml_item_response(base_prefix, href, found_props=(), not_found_props=(),
return response


class ApplicationReportMixin:
def do_REPORT(self, environ, base_prefix, path, user):
def retrieve_items(
base_prefix: str, path: str, collection: storage.BaseCollection,
hreferences: Iterable[str], filters: Sequence[ET.Element],
multistatus: ET.Element) -> Iterator[Tuple[radicale_item.Item, bool]]:
"""Retrieves all items that are referenced in ``hreferences`` from
``collection`` and adds 404 responses for missing and invalid items
to ``multistatus``."""
collection_requested = False

def get_names() -> Iterator[str]:
"""Extracts all names from references in ``hreferences`` and adds
404 responses for invalid references to ``multistatus``.
If the whole collections is referenced ``collection_requested``
gets set to ``True``."""
nonlocal collection_requested
for hreference in hreferences:
try:
name = pathutils.name_from_path(hreference, collection)
except ValueError as e:
logger.warning("Skipping invalid path %r in REPORT request on "
"%r: %s", hreference, path, e)
response = xml_item_response(base_prefix, hreference,
found_item=False)
multistatus.append(response)
continue
if name:
# Reference is an item
yield name
else:
# Reference is a collection
collection_requested = True

for name, item in collection.get_multi(get_names()):
if not item:
uri = pathutils.unstrip_path(posixpath.join(collection.path, name))
response = xml_item_response(base_prefix, uri, found_item=False)
multistatus.append(response)
else:
yield item, False
if collection_requested:
yield from collection.get_filtered(filters)


def test_filter(collection_tag: str, item: radicale_item.Item,
filter_: ET.Element) -> bool:
"""Match an item against a filter."""
if (collection_tag == "VCALENDAR" and
filter_.tag != xmlutils.make_clark("C:%s" % filter_)):
if len(filter_) == 0:
return True
if len(filter_) > 1:
raise ValueError("Filter with %d children" % len(filter_))
if filter_[0].tag != xmlutils.make_clark("C:comp-filter"):
raise ValueError("Unexpected %r in filter" % filter_[0].tag)
return radicale_filter.comp_match(item, filter_[0])
if (collection_tag == "VADDRESSBOOK" and
filter_.tag != xmlutils.make_clark("CR:%s" % filter_)):
for child in filter_:
if child.tag != xmlutils.make_clark("CR:prop-filter"):
raise ValueError("Unexpected %r in filter" % child.tag)
test = filter_.get("test", "anyof")
if test == "anyof":
return any(radicale_filter.prop_match(item.vobject_item, f, "CR")
for f in filter_)
if test == "allof":
return all(radicale_filter.prop_match(item.vobject_item, f, "CR")
for f in filter_)
raise ValueError("Unsupported filter test: %r" % test)
raise ValueError("Unsupported filter %r for %r" %
(filter_.tag, collection_tag))


class ApplicationPartReport(ApplicationBase):

def do_REPORT(self, environ: types.WSGIEnviron, base_prefix: str,
path: str, user: str) -> types.WSGIResponse:
"""Manage REPORT request."""
access = app.Access(self._rights, user, path)
access = Access(self._rights, user, path)
if not access.check("r"):
return httputils.NOT_ALLOWED
try:
xml_content = self._read_xml_content(environ)
xml_content = self._read_xml_request_body(environ)
except RuntimeError as e:
logger.warning(
"Bad REPORT request on %r: %s", path, e, exc_info=True)
logger.warning("Bad REPORT request on %r: %s", path, e,
exc_info=True)
return httputils.BAD_REQUEST
except socket.timeout:
logger.debug("client timed out", exc_info=True)
logger.debug("Client timed out", exc_info=True)
return httputils.REQUEST_TIMEOUT
with contextlib.ExitStack() as lock_stack:
lock_stack.enter_context(self._storage.acquire_lock("r", user))
item = next(self._storage.discover(path), None)
item = next(iter(self._storage.discover(path)), None)
if not item:
return httputils.NOT_FOUND
if not access.check("r", item):

@@ -280,8 +293,8 @@ class ApplicationReportMixin:
if isinstance(item, storage.BaseCollection):
collection = item
else:
assert item.collection is not None
collection = item.collection
headers = {"Content-Type": "text/xml; charset=%s" % self._encoding}
try:
status, xml_answer = xml_report(
base_prefix, path, xml_content, collection, self._encoding,

@@ -290,4 +303,5 @@ class ApplicationReportMixin:
logger.warning(
"Bad REPORT request on %r: %s", path, e, exc_info=True)
return httputils.BAD_REQUEST
return (status, headers, self._write_xml_content(xml_answer))
headers = {"Content-Type": "text/xml; charset=%s" % self._encoding}
return status, headers, self._xml_response(xml_answer)

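The addressbook branch of `test_filter` above combines per-property matches with `any` or `all` depending on the filter's `test` attribute, with `anyof` as the default. A self-contained sketch of that semantics using only the standard library (the `prop-filter`/`name` structure and the `matched` lookup stand in for Radicale's `radicale_filter.prop_match`):

```python
import xml.etree.ElementTree as ET


def prop_filters_match(filter_: ET.Element, matched: dict) -> bool:
    # ``matched`` maps a property name to whether the item matched it;
    # it replaces the real prop_match call in this sketch.
    test = filter_.get("test", "anyof")  # "anyof" is the default
    results = (matched.get(child.get("name"), False) for child in filter_)
    if test == "anyof":
        return any(results)   # at least one child filter must match
    if test == "allof":
        return all(results)   # every child filter must match
    raise ValueError("Unsupported filter test: %r" % test)


root = ET.fromstring(
    '<filter test="allof">'
    '<prop-filter name="FN"/><prop-filter name="EMAIL"/></filter>')
assert prop_filters_match(root, {"FN": True, "EMAIL": True}) is True
assert prop_filters_match(root, {"FN": True, "EMAIL": False}) is False
```

Because the per-child results are a generator, `any`/`all` short-circuit as soon as the outcome is decided.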
@@ -1,4 +1,4 @@
# This file is part of Radicale Server - Calendar Server
# This file is part of Radicale - CalDAV and CardDAV server
# Copyright © 2008 Nicolas Kandel
# Copyright © 2008 Pascal Halter
# Copyright © 2008-2017 Guillaume Ayoub

@@ -28,18 +28,23 @@ Take a look at the class ``BaseAuth`` if you want to implement your own.

"""

from radicale import utils
from typing import Sequence, Tuple, Union

INTERNAL_TYPES = ("none", "remote_user", "http_x_remote_user", "htpasswd")
from radicale import config, types, utils

INTERNAL_TYPES: Sequence[str] = ("none", "remote_user", "http_x_remote_user",
"htpasswd")


def load(configuration):
def load(configuration: "config.Configuration") -> "BaseAuth":
"""Load the authentication module chosen in configuration."""
return utils.load_plugin(INTERNAL_TYPES, "auth", "Auth", configuration)
return utils.load_plugin(INTERNAL_TYPES, "auth", "Auth", BaseAuth,
configuration)


class BaseAuth:
def __init__(self, configuration):

def __init__(self, configuration: "config.Configuration") -> None:
"""Initialize BaseAuth.

``configuration`` see ``radicale.config`` module.

@@ -49,7 +54,8 @@ class BaseAuth:
"""
self.configuration = configuration

def get_external_login(self, environ):
def get_external_login(self, environ: types.WSGIEnviron) -> Union[
Tuple[()], Tuple[str, str]]:
"""Optionally provide the login and password externally.

``environ`` a dict with the WSGI environment

@@ -61,14 +67,14 @@ class BaseAuth:
"""
return ()

def login(self, login, password):
def login(self, login: str, password: str) -> str:
"""Check credentials and map login to internal user

``login`` the login name

``password`` the password

Returns the user name or ``""`` for invalid credentials.
Returns the username or ``""`` for invalid credentials.

"""

@@ -1,4 +1,4 @@
# This file is part of Radicale Server - Calendar Server
# This file is part of Radicale - CalDAV and CardDAV server
# Copyright © 2008 Nicolas Kandel
# Copyright © 2008 Pascal Halter
# Copyright © 2008-2017 Guillaume Ayoub

@@ -49,18 +49,23 @@ When passlib[bcrypt] is installed:

import functools
import hmac
from typing import Any

from passlib.hash import apr_md5_crypt

from radicale import auth
from radicale import auth, config


class Auth(auth.BaseAuth):
def __init__(self, configuration):

_filename: str
_encoding: str

def __init__(self, configuration: config.Configuration) -> None:
super().__init__(configuration)
self._filename = configuration.get("auth", "htpasswd_filename")
self._encoding = self.configuration.get("encoding", "stock")
encryption = configuration.get("auth", "htpasswd_encryption")
self._encoding = configuration.get("encoding", "stock")
encryption: str = configuration.get("auth", "htpasswd_encryption")

if encryption == "plain":
self._verify = self._plain

@@ -82,17 +87,17 @@ class Auth(auth.BaseAuth):
raise RuntimeError("The htpasswd encryption method %r is not "
"supported." % encryption)

def _plain(self, hash_value, password):
def _plain(self, hash_value: str, password: str) -> bool:
"""Check if ``hash_value`` and ``password`` match, plain method."""
return hmac.compare_digest(hash_value.encode(), password.encode())

def _bcrypt(self, bcrypt, hash_value, password):
def _bcrypt(self, bcrypt: Any, hash_value: str, password: str) -> bool:
return bcrypt.verify(password, hash_value.strip())

def _md5apr1(self, hash_value, password):
def _md5apr1(self, hash_value: str, password: str) -> bool:
return apr_md5_crypt.verify(password, hash_value.strip())

def login(self, login, password):
def login(self, login: str, password: str) -> str:
"""Validate credentials.

Iterate through htpasswd credential file until login matches, extract

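The `_plain` method above deliberately uses `hmac.compare_digest` instead of `==`: the comparison takes the same time regardless of where the strings first differ, so response timing does not leak how much of the password was correct. A minimal standalone illustration of that constant-time check (the `verify_plain` name is illustrative, not Radicale's API):

```python
import hmac


def verify_plain(stored: str, supplied: str) -> bool:
    # Constant-time comparison: timing does not depend on the
    # position of the first mismatching character.
    return hmac.compare_digest(stored.encode(), supplied.encode())


assert verify_plain("s3cret", "s3cret")
assert not verify_plain("s3cret", "S3cret")
```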
@@ -1,4 +1,4 @@
# This file is part of Radicale Server - Calendar Server
# This file is part of Radicale - CalDAV and CardDAV server
# Copyright © 2008 Nicolas Kandel
# Copyright © 2008 Pascal Halter
# Copyright © 2008-2017 Guillaume Ayoub

@@ -26,9 +26,14 @@ if the reverse proxy is not configured properly.

"""

import radicale.auth.none as none
from typing import Tuple, Union

from radicale import types
from radicale.auth import none


class Auth(none.Auth):
def get_external_login(self, environ):

def get_external_login(self, environ: types.WSGIEnviron) -> Union[
Tuple[()], Tuple[str, str]]:
return environ.get("HTTP_X_REMOTE_USER", ""), ""

@@ -1,4 +1,4 @@
# This file is part of Radicale Server - Calendar Server
# This file is part of Radicale - CalDAV and CardDAV server
# Copyright © 2008 Nicolas Kandel
# Copyright © 2008 Pascal Halter
# Copyright © 2008-2017 Guillaume Ayoub

@@ -26,5 +26,6 @@ from radicale import auth


class Auth(auth.BaseAuth):
def login(self, login, password):

def login(self, login: str, password: str) -> str:
return login

@@ -1,4 +1,4 @@
# This file is part of Radicale Server - Calendar Server
# This file is part of Radicale - CalDAV and CardDAV server
# Copyright © 2008 Nicolas Kandel
# Copyright © 2008 Pascal Halter
# Copyright © 2008-2017 Guillaume Ayoub

@@ -25,9 +25,14 @@ It's intended for use with an external WSGI server.

"""

import radicale.auth.none as none
from typing import Tuple, Union

from radicale import types
from radicale.auth import none


class Auth(none.Auth):
def get_external_login(self, environ):

def get_external_login(self, environ: types.WSGIEnviron
) -> Union[Tuple[()], Tuple[str, str]]:
return environ.get("REMOTE_USER", ""), ""

@ -1,4 +1,4 @@
|
|||
# This file is part of Radicale Server - Calendar Server
|
||||
# This file is part of Radicale - CalDAV and CardDAV server
|
||||
# Copyright © 2008 Nicolas Kandel
|
||||
# Copyright © 2008 Pascal Halter
|
||||
# Copyright © 2008-2017 Guillaume Ayoub
|
||||
|
@ -22,42 +22,193 @@ Helper functions for HTTP.
|
|||
|
||||
"""
|
||||
|
||||
import contextlib
|
||||
import os
|
||||
import pathlib
|
||||
import sys
|
||||
import time
|
||||
from http import client
|
||||
from typing import List, Mapping, Union, cast
|
||||
|
||||
NOT_ALLOWED = (
|
||||
from radicale import config, pathutils, types
|
||||
from radicale.log import logger
|
||||
|
||||
if sys.version_info < (3, 9):
|
||||
import pkg_resources
|
||||
|
||||
_TRAVERSABLE_LIKE_TYPE = pathlib.Path
|
||||
else:
|
||||
import importlib.abc
|
||||
from importlib import resources
|
||||
|
||||
_TRAVERSABLE_LIKE_TYPE = Union[importlib.abc.Traversable, pathlib.Path]
|
||||
|
||||
NOT_ALLOWED: types.WSGIResponse = (
|
||||
client.FORBIDDEN, (("Content-Type", "text/plain"),),
|
||||
"Access to the requested resource forbidden.")
|
||||
FORBIDDEN = (
|
||||
FORBIDDEN: types.WSGIResponse = (
|
||||
client.FORBIDDEN, (("Content-Type", "text/plain"),),
|
||||
"Action on the requested resource refused.")
|
||||
BAD_REQUEST = (
|
||||
BAD_REQUEST: types.WSGIResponse = (
|
||||
client.BAD_REQUEST, (("Content-Type", "text/plain"),), "Bad Request")
|
||||
NOT_FOUND = (
|
||||
NOT_FOUND: types.WSGIResponse = (
|
||||
client.NOT_FOUND, (("Content-Type", "text/plain"),),
|
||||
"The requested resource could not be found.")
|
||||
CONFLICT = (
|
||||
CONFLICT: types.WSGIResponse = (
|
||||
client.CONFLICT, (("Content-Type", "text/plain"),),
|
||||
"Conflict in the request.")
|
||||
METHOD_NOT_ALLOWED = (
|
||||
METHOD_NOT_ALLOWED: types.WSGIResponse = (
|
||||
client.METHOD_NOT_ALLOWED, (("Content-Type", "text/plain"),),
|
||||
"The method is not allowed on the requested resource.")
|
||||
PRECONDITION_FAILED = (
|
||||
PRECONDITION_FAILED: types.WSGIResponse = (
|
||||
client.PRECONDITION_FAILED,
|
||||
(("Content-Type", "text/plain"),), "Precondition failed.")
|
||||
REQUEST_TIMEOUT = (
|
||||
REQUEST_TIMEOUT: types.WSGIResponse = (
|
||||
client.REQUEST_TIMEOUT, (("Content-Type", "text/plain"),),
|
||||
"Connection timed out.")
|
||||
REQUEST_ENTITY_TOO_LARGE = (
|
||||
REQUEST_ENTITY_TOO_LARGE: types.WSGIResponse = (
|
||||
client.REQUEST_ENTITY_TOO_LARGE, (("Content-Type", "text/plain"),),
|
||||
"Request body too large.")
|
||||
REMOTE_DESTINATION = (
|
||||
REMOTE_DESTINATION: types.WSGIResponse = (
|
||||
client.BAD_GATEWAY, (("Content-Type", "text/plain"),),
|
||||
"Remote destination not supported.")
|
||||
DIRECTORY_LISTING = (
|
||||
DIRECTORY_LISTING: types.WSGIResponse = (
|
||||
client.FORBIDDEN, (("Content-Type", "text/plain"),),
|
||||
"Directory listings are not supported.")
|
||||
INTERNAL_SERVER_ERROR = (
|
||||
 INTERNAL_SERVER_ERROR: types.WSGIResponse = (
     client.INTERNAL_SERVER_ERROR, (("Content-Type", "text/plain"),),
     "A server error occurred. Please contact the administrator.")

-DAV_HEADERS = "1, 2, 3, calendar-access, addressbook, extended-mkcol"
+DAV_HEADERS: str = "1, 2, 3, calendar-access, addressbook, extended-mkcol"

 MIMETYPES: Mapping[str, str] = {
     ".css": "text/css",
     ".eot": "application/vnd.ms-fontobject",
     ".gif": "image/gif",
     ".html": "text/html",
     ".js": "application/javascript",
     ".manifest": "text/cache-manifest",
     ".png": "image/png",
     ".svg": "image/svg+xml",
     ".ttf": "application/font-sfnt",
     ".txt": "text/plain",
     ".woff": "application/font-woff",
     ".woff2": "font/woff2",
     ".xml": "text/xml"}
 FALLBACK_MIMETYPE: str = "application/octet-stream"


 def decode_request(configuration: "config.Configuration",
                    environ: types.WSGIEnviron, text: bytes) -> str:
     """Try to magically decode ``text`` according to given ``environ``."""
     # List of charsets to try
     charsets: List[str] = []

     # First append content charset given in the request
     content_type = environ.get("CONTENT_TYPE")
     if content_type and "charset=" in content_type:
         charsets.append(
             content_type.split("charset=")[1].split(";")[0].strip())
     # Then append default Radicale charset
     charsets.append(cast(str, configuration.get("encoding", "request")))
     # Then append various fallbacks
     charsets.append("utf-8")
     charsets.append("iso8859-1")
     # Remove duplicates
     for i, s in reversed(list(enumerate(charsets))):
         if s in charsets[:i]:
             del charsets[i]

     # Try to decode
     for charset in charsets:
         with contextlib.suppress(UnicodeDecodeError):
             return text.decode(charset)
     raise UnicodeDecodeError("decode_request", text, 0, len(text),
                              "all codecs failed [%s]" % ", ".join(charsets))


 def read_raw_request_body(configuration: "config.Configuration",
                           environ: types.WSGIEnviron) -> bytes:
     content_length = int(environ.get("CONTENT_LENGTH") or 0)
     if not content_length:
         return b""
     content = environ["wsgi.input"].read(content_length)
     if len(content) < content_length:
         raise RuntimeError("Request body too short: %d" % len(content))
     return content
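`read_raw_request_body` trusts `CONTENT_LENGTH` and treats a short read as a hard error. That behaviour can be exercised against an in-memory stream; `read_exact` is a hypothetical standalone re-statement, not Radicale's API:

```python
import io


def read_exact(stream: io.BufferedIOBase, content_length: int) -> bytes:
    # Sketch of the short-read check above: read exactly content_length
    # bytes and fail loudly if the stream ends early.
    if not content_length:
        return b""
    content = stream.read(content_length)
    if len(content) < content_length:
        raise RuntimeError("Request body too short: %d" % len(content))
    return content
```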


 def read_request_body(configuration: "config.Configuration",
                       environ: types.WSGIEnviron) -> str:
     content = decode_request(configuration, environ,
                              read_raw_request_body(configuration, environ))
     logger.debug("Request content:\n%s", content)
     return content


 def redirect(location: str, status: int = client.FOUND) -> types.WSGIResponse:
     return (status,
             {"Location": location, "Content-Type": "text/plain"},
             "Redirected to %s" % location)


 def _serve_traversable(
         traversable: _TRAVERSABLE_LIKE_TYPE, base_prefix: str, path: str,
         path_prefix: str, index_file: str, mimetypes: Mapping[str, str],
         fallback_mimetype: str) -> types.WSGIResponse:
     if path != path_prefix and not path.startswith(path_prefix):
         raise ValueError("path must start with path_prefix: %r --> %r" %
                          (path_prefix, path))
     assert pathutils.sanitize_path(path) == path
     parts_path = path[len(path_prefix):].strip('/')
     parts = parts_path.split("/") if parts_path else []
     for part in parts:
         if not pathutils.is_safe_filesystem_path_component(part):
             logger.debug("Web content with unsafe path %r requested", path)
             return NOT_FOUND
         if (not traversable.is_dir() or
                 all(part != entry.name for entry in traversable.iterdir())):
             return NOT_FOUND
         traversable = traversable.joinpath(part)
     if traversable.is_dir():
         if not path.endswith("/"):
             return redirect(base_prefix + path + "/")
         if not index_file:
             return NOT_FOUND
         traversable = traversable.joinpath(index_file)
     if not traversable.is_file():
         return NOT_FOUND
     content_type = MIMETYPES.get(
         os.path.splitext(traversable.name)[1].lower(), FALLBACK_MIMETYPE)
     headers = {"Content-Type": content_type}
     if isinstance(traversable, pathlib.Path):
         headers["Last-Modified"] = time.strftime(
             "%a, %d %b %Y %H:%M:%S GMT",
             time.gmtime(traversable.stat().st_mtime))
     answer = traversable.read_bytes()
     return client.OK, headers, answer


 def serve_resource(
         package: str, resource: str, base_prefix: str, path: str,
         path_prefix: str = "/.web", index_file: str = "index.html",
         mimetypes: Mapping[str, str] = MIMETYPES,
         fallback_mimetype: str = FALLBACK_MIMETYPE) -> types.WSGIResponse:
     if sys.version_info < (3, 9):
         traversable = pathlib.Path(
             pkg_resources.resource_filename(package, resource))
     else:
         traversable = resources.files(package).joinpath(resource)
     return _serve_traversable(traversable, base_prefix, path, path_prefix,
                               index_file, mimetypes, fallback_mimetype)


 def serve_folder(
         folder: str, base_prefix: str, path: str,
         path_prefix: str = "/.web", index_file: str = "index.html",
         mimetypes: Mapping[str, str] = MIMETYPES,
         fallback_mimetype: str = FALLBACK_MIMETYPE) -> types.WSGIResponse:
     # deprecated: use `serve_resource` instead
     traversable = pathlib.Path(folder)
     return _serve_traversable(traversable, base_prefix, path, path_prefix,
                               index_file, mimetypes, fallback_mimetype)

@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2014 Jean-Marc Martins

@@ -24,30 +24,50 @@ Module for address books and calendar entries (see ``Item``).
 """

 import binascii
 import contextlib
 import math
 import os
-import sys
-from datetime import timedelta
+import re
+from datetime import datetime, timedelta
 from hashlib import sha256
 from itertools import chain
+from typing import (Any, Callable, List, MutableMapping, Optional, Sequence,
+                    Tuple)

 import vobject

 from radicale import pathutils
+from radicale import storage  # noqa:F401
 from radicale.item import filter as radicale_filter
 from radicale.log import logger


-def predict_tag_of_parent_collection(vobject_items):
+def read_components(s: str) -> List[vobject.base.Component]:
+    """Wrapper for vobject.readComponents"""
+    # Workaround for bug in InfCloud
+    # PHOTO is a data URI
+    s = re.sub(r"^(PHOTO(?:;[^:\r\n]*)?;ENCODING=b(?:;[^:\r\n]*)?:)"
+               r"data:[^;,\r\n]*;base64,", r"\1", s,
+               flags=re.MULTILINE | re.IGNORECASE)
+    return list(vobject.readComponents(s))
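The `re.sub` workaround above strips the `data:` URI prefix that InfCloud puts in front of base64 `PHOTO` values before the text reaches the vobject parser. Its effect can be seen on a minimal vCard line; the sample data below is illustrative, not taken from the source:

```python
import re


def strip_photo_data_uri(s: str) -> str:
    # Same pattern as above: keep the PHOTO property prefix (group 1)
    # and drop the "data:...;base64," part before the raw base64 data.
    return re.sub(r"^(PHOTO(?:;[^:\r\n]*)?;ENCODING=b(?:;[^:\r\n]*)?:)"
                  r"data:[^;,\r\n]*;base64,", r"\1", s,
                  flags=re.MULTILINE | re.IGNORECASE)
```

Lines without the `data:` prefix pass through unchanged, so the substitution is safe to apply to every upload.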
+
+
+def predict_tag_of_parent_collection(
+        vobject_items: Sequence[vobject.base.Component]) -> Optional[str]:
+    """Returns the predicted tag or `None`"""
     if len(vobject_items) != 1:
-        return ""
+        return None
     if vobject_items[0].name == "VCALENDAR":
         return "VCALENDAR"
     if vobject_items[0].name in ("VCARD", "VLIST"):
         return "VADDRESSBOOK"
-    return ""
+    return None


-def predict_tag_of_whole_collection(vobject_items, fallback_tag=None):
+def predict_tag_of_whole_collection(
+        vobject_items: Sequence[vobject.base.Component],
+        fallback_tag: Optional[str] = None) -> Optional[str]:
+    """Returns the predicted tag or `fallback_tag`"""
     if vobject_items and vobject_items[0].name == "VCALENDAR":
         return "VCALENDAR"
     if vobject_items and vobject_items[0].name in ("VCARD", "VLIST"):

@@ -58,9 +78,13 @@ def predict_tag_of_whole_collection(vobject_items, fallback_tag=None):
     return fallback_tag


-def check_and_sanitize_items(vobject_items, is_collection=False, tag=None):
+def check_and_sanitize_items(
+        vobject_items: List[vobject.base.Component],
+        is_collection: bool = False, tag: str = "") -> None:
     """Check vobject items for common errors and add missing UIDs.

+    Modifies the list `vobject_items`.
+
     ``is_collection`` indicates that vobject_item contains unrelated
     components.

@@ -130,12 +154,31 @@ def check_and_sanitize_items(vobject_items, is_collection=False, tag=None):
                     logger.debug("Quirks: Removing zero duration from %s in "
                                  "object %r", component_name, component_uid)
                     del component.duration
+            # Workaround for Evolution
+            # EXDATE has value DATE even if DTSTART/DTEND is DATE-TIME.
+            # The RFC is vaguely formulated on the issue.
+            # To resolve the issue convert EXDATE and RDATE to
+            # the same type as DTSTART
+            if hasattr(component, "dtstart"):
+                ref_date = component.dtstart.value
+                ref_value_param = component.dtstart.params.get("VALUE")
+                for dates in chain(component.contents.get("exdate", []),
+                                   component.contents.get("rdate", [])):
+                    if all(type(d) == type(ref_date) for d in dates.value):
+                        continue
+                    for i, date in enumerate(dates.value):
+                        dates.value[i] = ref_date.replace(
+                            date.year, date.month, date.day)
+                    with contextlib.suppress(KeyError):
+                        del dates.params["VALUE"]
+                    if ref_value_param is not None:
+                        dates.params["VALUE"] = ref_value_param
             # vobject interprets recurrence rules on demand
             try:
                 component.rruleset
             except Exception as e:
-                raise ValueError("invalid recurrence rules in %s" %
-                                 component.name) from e
+                raise ValueError("Invalid recurrence rules in %s in object %r"
+                                 % (component.name, component_uid)) from e
     elif tag == "VADDRESSBOOK":
         # https://tools.ietf.org/html/rfc6352#section-5.1
         object_uids = set()

@@ -164,19 +207,36 @@ def check_and_sanitize_items(vobject_items, is_collection=False, tag=None):
             else:
                 vobject_item.add("UID").value = object_uid
     else:
-        for i in vobject_items:
+        for item in vobject_items:
             raise ValueError("Item type %r not supported in %s collection" %
-                             (i.name, repr(tag) if tag else "generic"))
+                             (item.name, repr(tag) if tag else "generic"))


-def check_and_sanitize_props(props):
-    """Check collection properties for common errors."""
-    tag = props.get("tag")
-    if tag and tag not in ("VCALENDAR", "VADDRESSBOOK"):
-        raise ValueError("Unsupported collection tag: %r" % tag)
+def check_and_sanitize_props(props: MutableMapping[Any, Any]
+                             ) -> MutableMapping[str, str]:
+    """Check collection properties for common errors.
+
+    Modifies the dict `props`.
+
+    """
+    for k, v in list(props.items()):  # Make copy to be able to delete items
+        if not isinstance(k, str):
+            raise ValueError("Key must be %r not %r: %r" % (
+                str.__name__, type(k).__name__, k))
+        if not isinstance(v, str):
+            if v is None:
+                del props[k]
+                continue
+            raise ValueError("Value of %r must be %r not %r: %r" % (
+                k, str.__name__, type(v).__name__, v))
+        if k == "tag":
+            if v not in ("", "VCALENDAR", "VADDRESSBOOK"):
+                raise ValueError("Unsupported collection tag: %r" % v)
+    return props


-def find_available_uid(exists_fn, suffix=""):
+def find_available_uid(exists_fn: Callable[[str], bool], suffix: str = ""
+                       ) -> str:
     """Generate a pseudo-random UID"""
     # Prevent infinite loop
     for _ in range(1000):

@@ -185,11 +245,11 @@ def find_available_uid(exists_fn, suffix=""):
             r[:8], r[8:12], r[12:16], r[16:20], r[20:], suffix)
         if not exists_fn(name):
             return name
-    # something is wrong with the PRNG
-    raise RuntimeError("No unique random sequence found")
+    # Something is wrong with the PRNG or `exists_fn`
+    raise RuntimeError("No available random UID found")
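`find_available_uid` (its body is partly elided by the hunk above) formats random hex into a UUID-shaped name, as the `r[:8], r[8:12], r[12:16], r[16:20], r[20:]` line shows. A standalone sketch of just that shaping step; `random_uuid_like` is a hypothetical helper name:

```python
import binascii
import os


def random_uuid_like(suffix: str = "") -> str:
    # 16 random bytes -> 32 hex characters, grouped 8-4-4-4-12 like a UUID,
    # with an optional suffix such as ".ics" appended.
    r = binascii.hexlify(os.urandom(16)).decode("ascii")
    return "%s-%s-%s-%s-%s%s" % (
        r[:8], r[8:12], r[12:16], r[16:20], r[20:], suffix)
```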


-def get_etag(text):
+def get_etag(text: str) -> str:
     """Etag from collection or item.

     Encoded as quoted-string (see RFC 2616).

@@ -200,13 +260,13 @@ def get_etag(text):
     return '"%s"' % etag.hexdigest()


-def get_uid(vobject_component):
+def get_uid(vobject_component: vobject.base.Component) -> str:
     """UID value of an item if defined."""
-    return (vobject_component.uid.value
-            if hasattr(vobject_component, "uid") else None)
+    return (vobject_component.uid.value or ""
+            if hasattr(vobject_component, "uid") else "")


-def get_uid_from_object(vobject_item):
+def get_uid_from_object(vobject_item: vobject.base.Component) -> str:
     """UID value of a calendar/addressbook object."""
     if vobject_item.name == "VCALENDAR":
         if hasattr(vobject_item, "vevent"):

@@ -217,10 +277,10 @@ def get_uid_from_object(vobject_item):
             return get_uid(vobject_item.vtodo)
     elif vobject_item.name == "VCARD":
         return get_uid(vobject_item)
-    return None
+    return ""


-def find_tag(vobject_item):
+def find_tag(vobject_item: vobject.base.Component) -> str:
     """Find component name from ``vobject_item``."""
     if vobject_item.name == "VCALENDAR":
         for component in vobject_item.components():

@@ -229,22 +289,24 @@ def find_tag(vobject_item):
     return ""


-def find_tag_and_time_range(vobject_item):
-    """Find component name and enclosing time range from ``vobject item``.
+def find_time_range(vobject_item: vobject.base.Component, tag: str
+                    ) -> Tuple[int, int]:
+    """Find enclosing time range from ``vobject item``.

-    Returns a tuple (``tag``, ``start``, ``end``) where ``tag`` is a string
-    and ``start`` and ``end`` are POSIX timestamps (as int).
+    ``tag`` must be set to the return value of ``find_tag``.
+
+    Returns a tuple (``start``, ``end``) where ``start`` and ``end`` are
+    POSIX timestamps.

     This is intended to be used for matching against simplified prefilters.

     """
-    tag = find_tag(vobject_item)
     if not tag:
-        return (
-            tag, radicale_filter.TIMESTAMP_MIN, radicale_filter.TIMESTAMP_MAX)
+        return radicale_filter.TIMESTAMP_MIN, radicale_filter.TIMESTAMP_MAX
     start = end = None

-    def range_fn(range_start, range_end, is_recurrence):
+    def range_fn(range_start: datetime, range_end: datetime,
+                 is_recurrence: bool) -> bool:
         nonlocal start, end
         if start is None or range_start < start:
             start = range_start

@@ -252,7 +314,7 @@ def find_tag_and_time_range(vobject_item):
             end = range_end
         return False

-    def infinity_fn(range_start):
+    def infinity_fn(range_start: datetime) -> bool:
         nonlocal start, end
         if start is None or range_start < start:
             start = range_start

@@ -264,22 +326,37 @@ def find_tag_and_time_range(vobject_item):
         start = radicale_filter.DATETIME_MIN
     if end is None:
         end = radicale_filter.DATETIME_MAX
-    try:
-        return tag, math.floor(start.timestamp()), math.ceil(end.timestamp())
-    except ValueError as e:
-        if str(e) == ("offset must be a timedelta representing a whole "
-                      "number of minutes") and sys.version_info < (3, 6):
-            raise RuntimeError("Unsupported in Python < 3.6: %s" % e) from e
-        raise
+    return math.floor(start.timestamp()), math.ceil(end.timestamp())


 class Item:
     """Class for address book and calendar entries."""

-    def __init__(self, collection_path=None, collection=None,
-                 vobject_item=None, href=None, last_modified=None, text=None,
-                 etag=None, uid=None, name=None, component_name=None,
-                 time_range=None):
+    collection: Optional["storage.BaseCollection"]
+    href: Optional[str]
+    last_modified: Optional[str]
+
+    _collection_path: str
+    _text: Optional[str]
+    _vobject_item: Optional[vobject.base.Component]
+    _etag: Optional[str]
+    _uid: Optional[str]
+    _name: Optional[str]
+    _component_name: Optional[str]
+    _time_range: Optional[Tuple[int, int]]
+
+    def __init__(self,
+                 collection_path: Optional[str] = None,
+                 collection: Optional["storage.BaseCollection"] = None,
+                 vobject_item: Optional[vobject.base.Component] = None,
+                 href: Optional[str] = None,
+                 last_modified: Optional[str] = None,
+                 text: Optional[str] = None,
+                 etag: Optional[str] = None,
+                 uid: Optional[str] = None,
+                 name: Optional[str] = None,
+                 component_name: Optional[str] = None,
+                 time_range: Optional[Tuple[int, int]] = None):
         """Initialize an item.

         ``collection_path`` the path of the parent collection (optional if

@@ -305,16 +382,15 @@ class Item:
         ``component_name`` the name of the primary component (optional).
         See ``find_tag``.

-        ``time_range`` the enclosing time range.
-        See ``find_tag_and_time_range``.
+        ``time_range`` the enclosing time range. See ``find_time_range``.

         """
         if text is None and vobject_item is None:
             raise ValueError(
-                "at least one of 'text' or 'vobject_item' must be set")
+                "At least one of 'text' or 'vobject_item' must be set")
         if collection_path is None:
             if collection is None:
-                raise ValueError("at least one of 'collection_path' or "
+                raise ValueError("At least one of 'collection_path' or "
                                  "'collection' must be set")
             collection_path = collection.path
         assert collection_path == pathutils.strip_path(

@@ -331,7 +407,7 @@ class Item:
         self._component_name = component_name
         self._time_range = time_range

-    def serialize(self):
+    def serialize(self) -> str:
         if self._text is None:
             try:
                 self._text = self.vobject_item.serialize()

@@ -353,38 +429,38 @@ class Item:
         return self._vobject_item

     @property
-    def etag(self):
+    def etag(self) -> str:
         """Encoded as quoted-string (see RFC 2616)."""
         if self._etag is None:
             self._etag = get_etag(self.serialize())
         return self._etag

     @property
-    def uid(self):
+    def uid(self) -> str:
         if self._uid is None:
             self._uid = get_uid_from_object(self.vobject_item)
         return self._uid

     @property
-    def name(self):
+    def name(self) -> str:
         if self._name is None:
             self._name = self.vobject_item.name or ""
         return self._name

     @property
-    def component_name(self):
-        if self._component_name is not None:
-            return self._component_name
-        return find_tag(self.vobject_item)
+    def component_name(self) -> str:
+        if self._component_name is None:
+            self._component_name = find_tag(self.vobject_item)
+        return self._component_name

     @property
-    def time_range(self):
+    def time_range(self) -> Tuple[int, int]:
         if self._time_range is None:
-            self._component_name, *self._time_range = (
-                find_tag_and_time_range(self.vobject_item))
+            self._time_range = find_time_range(
+                self.vobject_item, self.component_name)
         return self._time_range

-    def prepare(self):
+    def prepare(self) -> None:
         """Fill cache with values."""
         orig_vobject_item = self._vobject_item
         self.serialize()

@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2015 Guillaume Ayoub

@@ -19,35 +19,40 @@

 import math
 import xml.etree.ElementTree as ET
 from datetime import date, datetime, timedelta, timezone
 from itertools import chain
+from typing import (Callable, Iterable, Iterator, List, Optional, Sequence,
+                    Tuple)

-from radicale import xmlutils
+import vobject
+
+from radicale import item, xmlutils
 from radicale.log import logger

-DAY = timedelta(days=1)
-SECOND = timedelta(seconds=1)
-DATETIME_MIN = datetime.min.replace(tzinfo=timezone.utc)
-DATETIME_MAX = datetime.max.replace(tzinfo=timezone.utc)
-TIMESTAMP_MIN = math.floor(DATETIME_MIN.timestamp())
-TIMESTAMP_MAX = math.ceil(DATETIME_MAX.timestamp())
+DAY: timedelta = timedelta(days=1)
+SECOND: timedelta = timedelta(seconds=1)
+DATETIME_MIN: datetime = datetime.min.replace(tzinfo=timezone.utc)
+DATETIME_MAX: datetime = datetime.max.replace(tzinfo=timezone.utc)
+TIMESTAMP_MIN: int = math.floor(DATETIME_MIN.timestamp())
+TIMESTAMP_MAX: int = math.ceil(DATETIME_MAX.timestamp())


-def date_to_datetime(date_):
-    """Transform a date to a UTC datetime.
+def date_to_datetime(d: date) -> datetime:
+    """Transform any date to a UTC datetime.

-    If date_ is a datetime without timezone, return as UTC datetime. If date_
+    If ``d`` is a datetime without timezone, return as UTC datetime. If ``d``
     is already a datetime with timezone, return as is.

     """
-    if not isinstance(date_, datetime):
-        date_ = datetime.combine(date_, datetime.min.time())
-    if not date_.tzinfo:
-        date_ = date_.replace(tzinfo=timezone.utc)
-    return date_
+    if not isinstance(d, datetime):
+        d = datetime.combine(d, datetime.min.time())
+    if not d.tzinfo:
+        d = d.replace(tzinfo=timezone.utc)
+    return d
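The three cases handled by `date_to_datetime` can be checked directly. The function is re-stated standalone here (as `to_utc_datetime`, a hypothetical name) so the sketch runs on its own:

```python
from datetime import date, datetime, timezone


def to_utc_datetime(d: date) -> datetime:
    # Plain date -> midnight datetime tagged UTC; naive datetime ->
    # tagged as UTC; aware datetime -> returned unchanged.
    if not isinstance(d, datetime):
        d = datetime.combine(d, datetime.min.time())
    if not d.tzinfo:
        d = d.replace(tzinfo=timezone.utc)
    return d
```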


-def comp_match(item, filter_, level=0):
+def comp_match(item: "item.Item", filter_: ET.Element, level: int = 0) -> bool:
     """Check whether the ``item`` matches the comp ``filter_``.

     If ``level`` is ``0``, the filter is applied on the

@@ -70,7 +75,7 @@ def comp_match(item, filter_, level=0):
         return True
     if not tag:
         return False
-    name = filter_.get("name").upper()
+    name = filter_.get("name", "").upper()
     if len(filter_) == 0:
         # Point #1 of rfc4791-9.7.1
         return name == tag

@@ -104,18 +109,19 @@ def comp_match(item, filter_, level=0):
     return True


-def prop_match(vobject_item, filter_, ns):
+def prop_match(vobject_item: vobject.base.Component,
+               filter_: ET.Element, ns: str) -> bool:
     """Check whether the ``item`` matches the prop ``filter_``.

     See rfc4791-9.7.2 and rfc6352-10.5.1.

     """
-    name = filter_.get("name").lower()
+    name = filter_.get("name", "").lower()
     if len(filter_) == 0:
         # Point #1 of rfc4791-9.7.2
         return name in vobject_item.contents
     if len(filter_) == 1:
-        if filter_[0].tag == xmlutils.make_clark("C:is-not-defined"):
+        if filter_[0].tag == xmlutils.make_clark("%s:is-not-defined" % ns):
             # Point #2 of rfc4791-9.7.2
             return name not in vobject_item.contents
     if name not in vobject_item.contents:

@@ -136,20 +142,21 @@ def prop_match(vobject_item, filter_, ns):
     return True


-def time_range_match(vobject_item, filter_, child_name):
+def time_range_match(vobject_item: vobject.base.Component,
+                     filter_: ET.Element, child_name: str) -> bool:
     """Check whether the component/property ``child_name`` of
     ``vobject_item`` matches the time-range ``filter_``."""

-    start = filter_.get("start")
-    end = filter_.get("end")
-    if not start and not end:
+    start_text = filter_.get("start")
+    end_text = filter_.get("end")
+    if not start_text and not end_text:
         return False
-    if start:
-        start = datetime.strptime(start, "%Y%m%dT%H%M%SZ")
+    if start_text:
+        start = datetime.strptime(start_text, "%Y%m%dT%H%M%SZ")
     else:
         start = datetime.min
-    if end:
-        end = datetime.strptime(end, "%Y%m%dT%H%M%SZ")
+    if end_text:
+        end = datetime.strptime(end_text, "%Y%m%dT%H%M%SZ")
     else:
         end = datetime.max
     start = start.replace(tzinfo=timezone.utc)
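The `start`/`end` attributes of a CalDAV `time-range` filter use the basic UTC format `%Y%m%dT%H%M%SZ`. As above, `strptime` yields a naive datetime, so UTC must be attached explicitly; `parse_caldav_utc` is a hypothetical helper name for that two-step parse:

```python
from datetime import datetime, timezone


def parse_caldav_utc(text: str) -> datetime:
    # strptime leaves the result naive; attach UTC explicitly,
    # matching the replace(tzinfo=timezone.utc) step above.
    return datetime.strptime(
        text, "%Y%m%dT%H%M%SZ").replace(tzinfo=timezone.utc)
```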

@@ -157,7 +164,8 @@ def time_range_match(vobject_item, filter_, child_name):

     matched = False

-    def range_fn(range_start, range_end, is_recurrence):
+    def range_fn(range_start: datetime, range_end: datetime,
+                 is_recurrence: bool) -> bool:
         nonlocal matched
         if start < range_end and range_start < end:
             matched = True

@@ -166,14 +174,16 @@ def time_range_match(vobject_item, filter_, child_name):
             return True
         return False

-    def infinity_fn(start):
+    def infinity_fn(start: datetime) -> bool:
         return False

     visit_time_ranges(vobject_item, child_name, range_fn, infinity_fn)
     return matched


-def visit_time_ranges(vobject_item, child_name, range_fn, infinity_fn):
+def visit_time_ranges(vobject_item: vobject.base.Component, child_name: str,
+                      range_fn: Callable[[datetime, datetime, bool], bool],
+                      infinity_fn: Callable[[datetime], bool]) -> None:
     """Visit all time ranges in the component/property ``child_name`` of
     ``vobject_item`` with visitors ``range_fn`` and ``infinity_fn``.


@@ -181,7 +191,7 @@ def visit_time_ranges(vobject_item, child_name, range_fn, infinity_fn):
     datetimes and ``is_recurrence`` as arguments. If the function returns True,
     the operation is cancelled.

-    ``infinity_fn`` gets called when an infiite recurrence rule is detected
+    ``infinity_fn`` gets called when an infinite recurrence rule is detected
     with ``start`` datetime as argument. If the function returns True, the
     operation is cancelled.


@@ -194,10 +204,15 @@ def visit_time_ranges(vobject_item, child_name, range_fn, infinity_fn):
     # recurrences too. This is not respected and clients don't seem to bother
     # either.

-    def getrruleset(child, ignore=()):
-        if (hasattr(child, "rrule") and
-                ";UNTIL=" not in child.rrule.value.upper() and
-                ";COUNT=" not in child.rrule.value.upper()):
+    def getrruleset(child: vobject.base.Component, ignore: Sequence[date]
+                    ) -> Tuple[Iterable[date], bool]:
+        infinite = False
+        for rrule in child.contents.get("rrule", []):
+            if (";UNTIL=" not in rrule.value.upper() and
+                    ";COUNT=" not in rrule.value.upper()):
+                infinite = True
+                break
+        if infinite:
             for dtstart in child.getrruleset(addRDate=True):
                 if dtstart in ignore:
                     continue

@@ -207,7 +222,8 @@ def visit_time_ranges(vobject_item, child_name, range_fn, infinity_fn):
         return filter(lambda dtstart: dtstart not in ignore,
                       child.getrruleset(addRDate=True)), False

-    def get_children(components):
+    def get_children(components: Iterable[vobject.base.Component]) -> Iterator[
+            Tuple[vobject.base.Component, bool, List[date]]]:
         main = None
         recurrences = []
         for comp in components:

@@ -216,7 +232,7 @@ def visit_time_ranges(vobject_item, child_name, range_fn, infinity_fn):
                 if comp.rruleset:
                     # Prevent possible infinite loop
                     raise ValueError("Overwritten recurrence with RRULESET")
-                yield comp, True, ()
+                yield comp, True, []
             else:
                 if main is not None:
                     raise ValueError("Multiple main components")

@@ -410,12 +426,17 @@ def visit_time_ranges(vobject_item, child_name, range_fn, infinity_fn):
         # Match a property
         child = getattr(vobject_item, child_name.lower())
-        if isinstance(child, date):
-            range_fn(child, child + DAY, False)
-        elif isinstance(child, datetime):
-            range_fn(child, child + SECOND, False)
+        child_is_datetime = isinstance(child, datetime)
+        child = date_to_datetime(child)
+        if child_is_datetime:
+            range_fn(child, child + SECOND, False)
+        else:
+            range_fn(child, child + DAY, False)


-def text_match(vobject_item, filter_, child_name, ns, attrib_name=None):
+def text_match(vobject_item: vobject.base.Component,
+               filter_: ET.Element, child_name: str, ns: str,
+               attrib_name: Optional[str] = None) -> bool:
     """Check whether the ``item`` matches the text-match ``filter_``.

     See rfc4791-9.7.5.

@@ -429,7 +450,7 @@ def text_match(vobject_item, filter_, child_name, ns, attrib_name=None):
     if ns == "CR":
         match_type = filter_.get("match-type", match_type)

-    def match(value):
+    def match(value: str) -> bool:
         value = value.lower()
         if match_type == "equals":
             return value == text

@@ -442,7 +463,7 @@ def text_match(vobject_item, filter_, child_name, ns, attrib_name=None):
     raise ValueError("Unexpected text-match match-type: %r" % match_type)
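Only the `equals` branch of the inner `match` survives the hunk above; the elided branches presumably cover the remaining CardDAV match types named in RFC 6352 (`contains`, `starts-with`, `ends-with`). A hedged standalone sketch of all four, with the same lower-casing and the same terminal `ValueError`; `text_matches` and its argument order are assumptions, not Radicale's API:

```python
def text_matches(value: str, text: str, match_type: str = "contains") -> bool:
    # Case-insensitive comparison of a property value against the
    # filter text, one branch per match type.
    value, text = value.lower(), text.lower()
    if match_type == "equals":
        return value == text
    if match_type == "contains":
        return text in value
    if match_type == "starts-with":
        return value.startswith(text)
    if match_type == "ends-with":
        return value.endswith(text)
    raise ValueError("Unexpected text-match match-type: %r" % match_type)
```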

     children = getattr(vobject_item, "%s_list" % child_name, [])
-    if attrib_name:
+    if attrib_name is not None:
         condition = any(
             match(attrib) for child in children
             for attrib in child.params.get(attrib_name, []))

@@ -453,13 +474,14 @@ def text_match(vobject_item, filter_, child_name, ns, attrib_name=None):
     return condition


-def param_filter_match(vobject_item, filter_, parent_name, ns):
+def param_filter_match(vobject_item: vobject.base.Component,
+                       filter_: ET.Element, parent_name: str, ns: str) -> bool:
     """Check whether the ``item`` matches the param-filter ``filter_``.

     See rfc4791-9.7.3.

     """
-    name = filter_.get("name").upper()
+    name = filter_.get("name", "").upper()
     children = getattr(vobject_item, "%s_list" % parent_name, [])
     condition = any(name in child.params for child in children)
     if len(filter_) > 0:

@@ -471,7 +493,8 @@ def param_filter_match(vobject_item, filter_, parent_name, ns):
     return condition


-def simplify_prefilters(filters, collection_tag="VCALENDAR"):
+def simplify_prefilters(filters: Iterable[ET.Element], collection_tag: str
+                        ) -> Tuple[Optional[str], int, int, bool]:
     """Creates a simplified condition from ``filters``.

     Returns a tuple (``tag``, ``start``, ``end``, ``simple``) where ``tag`` is

@@ -480,14 +503,14 @@ def simplify_prefilters(filters, collection_tag="VCALENDAR"):
     and the simplified condition are identical.

     """
-    flat_filters = tuple(chain.from_iterable(filters))
+    flat_filters = list(chain.from_iterable(filters))
     simple = len(flat_filters) <= 1
     for col_filter in flat_filters:
         if collection_tag != "VCALENDAR":
             simple = False
             break
         if (col_filter.tag != xmlutils.make_clark("C:comp-filter") or
-                col_filter.get("name").upper() != "VCALENDAR"):
+                col_filter.get("name", "").upper() != "VCALENDAR"):
             simple = False
             continue
         simple &= len(col_filter) <= 1

@@ -495,7 +518,7 @@ def simplify_prefilters(filters, collection_tag="VCALENDAR"):
             if comp_filter.tag != xmlutils.make_clark("C:comp-filter"):
                 simple = False
                 continue
-            tag = comp_filter.get("name").upper()
+            tag = comp_filter.get("name", "").upper()
             if comp_filter.find(
                     xmlutils.make_clark("C:is-not-defined")) is not None:
                 simple = False

@@ -508,17 +531,17 @@ def simplify_prefilters(filters, collection_tag="VCALENDAR"):
                 if time_filter.tag != xmlutils.make_clark("C:time-range"):
                     simple = False
                     continue
-                start = time_filter.get("start")
-                end = time_filter.get("end")
-                if start:
+                start_text = time_filter.get("start")
+                end_text = time_filter.get("end")
+                if start_text:
                     start = math.floor(datetime.strptime(
-                        start, "%Y%m%dT%H%M%SZ").replace(
+                        start_text, "%Y%m%dT%H%M%SZ").replace(
                             tzinfo=timezone.utc).timestamp())
                 else:
                     start = TIMESTAMP_MIN
-                if end:
+                if end_text:
                     end = math.ceil(datetime.strptime(
-                        end, "%Y%m%dT%H%M%SZ").replace(
+                        end_text, "%Y%m%dT%H%M%SZ").replace(
                             tzinfo=timezone.utc).timestamp())
                 else:
                     end = TIMESTAMP_MAX

@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2011-2017 Guillaume Ayoub
 # Copyright © 2017-2019 Unrud <unrud@outlook.com>
 #

@@ -25,42 +25,46 @@ Log messages are sent to the first available target of:

 """

 import contextlib
 import logging
 import os
 import sys
 import threading
+from typing import Any, Callable, ClassVar, Dict, Iterator, Union

-LOGGER_NAME = "radicale"
-LOGGER_FORMAT = "[%(asctime)s] [%(ident)s] [%(levelname)s] %(message)s"
-DATE_FORMAT = "%Y-%m-%d %H:%M:%S %z"
+from radicale import types

-logger = logging.getLogger(LOGGER_NAME)
+LOGGER_NAME: str = "radicale"
+LOGGER_FORMAT: str = "[%(asctime)s] [%(ident)s] [%(levelname)s] %(message)s"
+DATE_FORMAT: str = "%Y-%m-%d %H:%M:%S %z"
+
+logger: logging.Logger = logging.getLogger(LOGGER_NAME)


 class RemoveTracebackFilter(logging.Filter):
-    def filter(self, record):
+
+    def filter(self, record: logging.LogRecord) -> bool:
         record.exc_info = None
         return True


-REMOVE_TRACEBACK_FILTER = RemoveTracebackFilter()
+REMOVE_TRACEBACK_FILTER: logging.Filter = RemoveTracebackFilter()


 class IdentLogRecordFactory:
     """LogRecordFactory that adds ``ident`` attribute."""

-    def __init__(self, upstream_factory):
-        self.upstream_factory = upstream_factory
+    def __init__(self, upstream_factory: Callable[..., logging.LogRecord]
+                 ) -> None:
+        self._upstream_factory = upstream_factory

-    def __call__(self, *args, **kwargs):
-        record = self.upstream_factory(*args, **kwargs)
+    def __call__(self, *args: Any, **kwargs: Any) -> logging.LogRecord:
+        record = self._upstream_factory(*args, **kwargs)
         ident = "%d" % os.getpid()
         main_thread = threading.main_thread()
         current_thread = threading.current_thread()
         if current_thread.name and main_thread != current_thread:
             ident += "/%s" % current_thread.name
-        record.ident = ident
+        record.ident = ident  # type:ignore[attr-defined]
         return record
|
||||
|
||||
|
||||
|
@ -68,13 +72,15 @@ class ThreadedStreamHandler(logging.Handler):
|
|||
"""Sends logging output to the stream registered for the current thread or
|
||||
``sys.stderr`` when no stream was registered."""
|
||||
|
||||
terminator = "\n"
|
||||
terminator: ClassVar[str] = "\n"
|
||||
|
||||
def __init__(self):
|
||||
_streams: Dict[int, types.ErrorStream]
|
||||
|
||||
def __init__(self) -> None:
|
||||
super().__init__()
|
||||
self._streams = {}
|
||||
|
||||
def emit(self, record):
|
||||
def emit(self, record: logging.LogRecord) -> None:
|
||||
try:
|
||||
stream = self._streams.get(threading.get_ident(), sys.stderr)
|
||||
msg = self.format(record)
|
||||
|
@ -85,8 +91,8 @@ class ThreadedStreamHandler(logging.Handler):
|
|||
except Exception:
|
||||
self.handleError(record)
|
||||
|
||||
@contextlib.contextmanager
|
||||
def register_stream(self, stream):
|
||||
@types.contextmanager
|
||||
def register_stream(self, stream: types.ErrorStream) -> Iterator[None]:
|
||||
"""Register stream for logging output of the current thread."""
|
||||
key = threading.get_ident()
|
||||
self._streams[key] = stream
|
||||
|
@ -96,13 +102,13 @@ class ThreadedStreamHandler(logging.Handler):
|
|||
del self._streams[key]
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def register_stream(stream):
|
||||
@types.contextmanager
|
||||
def register_stream(stream: types.ErrorStream) -> Iterator[None]:
|
||||
"""Register stream for logging output of the current thread."""
|
||||
yield
|
||||
|
||||
|
||||
def setup():
|
||||
def setup() -> None:
|
||||
"""Set global logging up."""
|
||||
global register_stream
|
||||
handler = ThreadedStreamHandler()
|
||||
|
@ -114,12 +120,12 @@ def setup():
|
|||
set_level(logging.WARNING)
|
||||
|
||||
|
||||
def set_level(level):
|
||||
def set_level(level: Union[int, str]) -> None:
|
||||
"""Set logging level for global logger."""
|
||||
if isinstance(level, str):
|
||||
level = getattr(logging, level.upper())
|
||||
assert isinstance(level, int)
|
||||
logger.setLevel(level)
|
||||
if level == logging.DEBUG:
|
||||
logger.removeFilter(REMOVE_TRACEBACK_FILTER)
|
||||
else:
|
||||
logger.removeFilter(REMOVE_TRACEBACK_FILTER)
|
||||
if level > logging.DEBUG:
|
||||
logger.addFilter(REMOVE_TRACEBACK_FILTER)
|
||||
|
|
|
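The `ThreadedStreamHandler` annotated above routes each log record to a stream registered for the current thread, falling back to `sys.stderr`. A minimal sketch of the idea (simplified: registration here is a plain method rather than Radicale's context manager, and the class name is a stand-in):

```python
import io
import logging
import sys
import threading


class ThreadStreamHandler(logging.Handler):
    """Send each record to the stream registered for the current thread."""

    def __init__(self):
        super().__init__()
        self._streams = {}  # thread ident -> stream

    def register(self, stream):
        self._streams[threading.get_ident()] = stream

    def emit(self, record):
        # Fall back to stderr when no stream was registered for this thread.
        stream = self._streams.get(threading.get_ident(), sys.stderr)
        stream.write(self.format(record) + "\n")


log = logging.getLogger("demo")
log.propagate = False
log.setLevel(logging.INFO)
handler = ThreadStreamHandler()
log.addHandler(handler)

buf = io.StringIO()
handler.register(buf)          # this thread's output now goes to buf
log.info("hello")
print(buf.getvalue().strip())  # hello
```

In a threaded WSGI server this lets each request thread capture its own log output (e.g. for a debug response) without interfering with other requests.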
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -21,17 +21,23 @@ Helper functions for working with the file system.

 """

-import contextlib
+import errno
 import os
 import posixpath
 import sys
 import threading
+from tempfile import TemporaryDirectory
+from typing import Iterator, Type, Union

-if os.name == "nt":
+from radicale import storage, types
+
+if sys.platform == "win32":
     import ctypes
     import ctypes.wintypes
     import msvcrt

-    LOCKFILE_EXCLUSIVE_LOCK = 2
+    LOCKFILE_EXCLUSIVE_LOCK: int = 2
+    ULONG_PTR: Union[Type[ctypes.c_uint32], Type[ctypes.c_uint64]]
     if ctypes.sizeof(ctypes.c_void_p) == 4:
         ULONG_PTR = ctypes.c_uint32
     else:
@@ -45,7 +51,8 @@
             ("offset_high", ctypes.wintypes.DWORD),
             ("h_event", ctypes.wintypes.HANDLE)]

-    lock_file_ex = ctypes.windll.kernel32.LockFileEx
+    kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
+    lock_file_ex = kernel32.LockFileEx
     lock_file_ex.argtypes = [
         ctypes.wintypes.HANDLE,
         ctypes.wintypes.DWORD,
@@ -54,7 +61,7 @@
         ctypes.wintypes.DWORD,
         ctypes.POINTER(Overlapped)]
     lock_file_ex.restype = ctypes.wintypes.BOOL
-    unlock_file_ex = ctypes.windll.kernel32.UnlockFileEx
+    unlock_file_ex = kernel32.UnlockFileEx
     unlock_file_ex.argtypes = [
         ctypes.wintypes.HANDLE,
         ctypes.wintypes.DWORD,
@@ -62,21 +69,46 @@
         ctypes.wintypes.DWORD,
         ctypes.POINTER(Overlapped)]
     unlock_file_ex.restype = ctypes.wintypes.BOOL
-elif os.name == "posix":
+else:
     import fcntl

+    if sys.platform == "linux":
+        import ctypes
+
+        RENAME_EXCHANGE: int = 2
+        renameat2 = None
+        try:
+            renameat2 = ctypes.CDLL(None, use_errno=True).renameat2
+        except AttributeError:
+            pass
+        else:
+            renameat2.argtypes = [
+                ctypes.c_int, ctypes.c_char_p,
+                ctypes.c_int, ctypes.c_char_p,
+                ctypes.c_uint]
+            renameat2.restype = ctypes.c_int
+
+    if sys.platform == "darwin":
+        # Definition missing in PyPy
+        F_FULLFSYNC: int = getattr(fcntl, "F_FULLFSYNC", 51)
+

 class RwLock:
     """A readers-Writer lock that locks a file."""

-    def __init__(self, path):
+    _path: str
+    _readers: int
+    _writer: bool
+    _lock: threading.Lock
+
+    def __init__(self, path: str) -> None:
         self._path = path
         self._readers = 0
         self._writer = False
         self._lock = threading.Lock()

     @property
-    def locked(self):
+    def locked(self) -> str:
         with self._lock:
             if self._readers > 0:
                 return "r"
@@ -84,28 +116,28 @@ class RwLock:
                 return "w"
             return ""

-    @contextlib.contextmanager
-    def acquire(self, mode):
+    @types.contextmanager
+    def acquire(self, mode: str) -> Iterator[None]:
         if mode not in "rw":
             raise ValueError("Invalid mode: %r" % mode)
         with open(self._path, "w+") as lock_file:
-            if os.name == "nt":
+            if sys.platform == "win32":
                 handle = msvcrt.get_osfhandle(lock_file.fileno())
                 flags = LOCKFILE_EXCLUSIVE_LOCK if mode == "w" else 0
                 overlapped = Overlapped()
-                if not lock_file_ex(handle, flags, 0, 1, 0, overlapped):
-                    raise RuntimeError("Locking the storage failed: %s" %
-                                       ctypes.FormatError())
-            elif os.name == "posix":
+                try:
+                    if not lock_file_ex(handle, flags, 0, 1, 0, overlapped):
+                        raise ctypes.WinError()
+                except OSError as e:
+                    raise RuntimeError("Locking the storage failed: %s" % e
+                                       ) from e
+            else:
                 _cmd = fcntl.LOCK_EX if mode == "w" else fcntl.LOCK_SH
                 try:
                     fcntl.flock(lock_file.fileno(), _cmd)
                 except OSError as e:
-                    raise RuntimeError("Locking the storage failed: %s" %
-                                       e) from e
-            else:
-                raise RuntimeError("Locking the storage failed: "
-                                   "Unsupported operating system")
+                    raise RuntimeError("Locking the storage failed: %s" % e
+                                       ) from e
             with self._lock:
                 if self._writer or mode == "w" and self._readers != 0:
                     raise RuntimeError("Locking the storage failed: "
@@ -123,19 +155,65 @@ class RwLock:
                 self._writer = False


-def fsync(fd):
-    if os.name == "posix" and hasattr(fcntl, "F_FULLFSYNC"):
-        fcntl.fcntl(fd, fcntl.F_FULLFSYNC)
-    else:
-        os.fsync(fd)
+def rename_exchange(src: str, dst: str) -> None:
+    """Exchange the files or directories `src` and `dst`.
+
+    Both `src` and `dst` must exist but may be of different types.
+
+    On Linux with renameat2 the operation is atomic.
+    On other platforms it's not atomic.
+
+    """
+    src_dir, src_base = os.path.split(src)
+    dst_dir, dst_base = os.path.split(dst)
+    src_dir = src_dir or os.curdir
+    dst_dir = dst_dir or os.curdir
+    if not src_base or not dst_base:
+        raise ValueError("Invalid arguments: %r -> %r" % (src, dst))
+    if sys.platform == "linux" and renameat2:
+        src_base_bytes = os.fsencode(src_base)
+        dst_base_bytes = os.fsencode(dst_base)
+        src_dir_fd = os.open(src_dir, 0)
+        try:
+            dst_dir_fd = os.open(dst_dir, 0)
+            try:
+                if renameat2(src_dir_fd, src_base_bytes,
+                             dst_dir_fd, dst_base_bytes,
+                             RENAME_EXCHANGE) == 0:
+                    return
+                errno_ = ctypes.get_errno()
+                # Fallback if RENAME_EXCHANGE not supported by filesystem
+                if errno_ != errno.EINVAL:
+                    raise OSError(errno_, os.strerror(errno_))
+            finally:
+                os.close(dst_dir_fd)
+        finally:
+            os.close(src_dir_fd)
+    with TemporaryDirectory(prefix=".Radicale.tmp-", dir=src_dir
+                            ) as tmp_dir:
+        os.rename(dst, os.path.join(tmp_dir, "interim"))
+        os.rename(src, dst)
+        os.rename(os.path.join(tmp_dir, "interim"), src)


-def strip_path(path):
+def fsync(fd: int) -> None:
+    if sys.platform == "darwin":
+        try:
+            fcntl.fcntl(fd, F_FULLFSYNC)
+            return
+        except OSError as e:
+            # Fallback if F_FULLFSYNC not supported by filesystem
+            if e.errno != errno.EINVAL:
+                raise
+    os.fsync(fd)
+
+
+def strip_path(path: str) -> str:
     assert sanitize_path(path) == path
     return path.strip("/")


-def unstrip_path(stripped_path, trailing_slash=False):
+def unstrip_path(stripped_path: str, trailing_slash: bool = False) -> str:
     assert strip_path(sanitize_path(stripped_path)) == stripped_path
     assert stripped_path or trailing_slash
     path = "/%s" % stripped_path
@@ -144,7 +222,7 @@ def unstrip_path(stripped_path, trailing_slash=False):
     return path


-def sanitize_path(path):
+def sanitize_path(path: str) -> str:
     """Make path absolute with leading slash to prevent access to other data.

     Preserve potential trailing slash.
@@ -161,16 +239,16 @@ def sanitize_path(path):
     return new_path + trailing_slash


-def is_safe_path_component(path):
+def is_safe_path_component(path: str) -> bool:
     """Check if path is a single component of a path.

     Check that the path is safe to join too.

     """
-    return path and "/" not in path and path not in (".", "..")
+    return bool(path) and "/" not in path and path not in (".", "..")


-def is_safe_filesystem_path_component(path):
+def is_safe_filesystem_path_component(path: str) -> bool:
     """Check if path is a single component of a local and posix filesystem
     path.

@@ -178,13 +256,14 @@ def is_safe_filesystem_path_component(path):

     """
     return (
-        path and not os.path.splitdrive(path)[0] and
+        bool(path) and not os.path.splitdrive(path)[0] and
+        (sys.platform != "win32" or ":" not in path) and  # Block NTFS-ADS
         not os.path.split(path)[0] and path not in (os.curdir, os.pardir) and
         not path.startswith(".") and not path.endswith("~") and
         is_safe_path_component(path))


-def path_to_filesystem(root, sane_path):
+def path_to_filesystem(root: str, sane_path: str) -> str:
     """Convert `sane_path` to a local filesystem path relative to `root`.

     `root` must be a secure filesystem path, it will be prepend to the path.
@@ -206,25 +285,25 @@ def path_to_filesystem(root, sane_path):
         # Check for conflicting files (e.g. case-insensitive file systems
         # or short names on Windows file systems)
         if (os.path.lexists(safe_path) and
-                part not in (e.name for e in
-                             os.scandir(safe_path_parent))):
+                part not in (e.name for e in os.scandir(safe_path_parent))):
             raise CollidingPathError(part)
     return safe_path


 class UnsafePathError(ValueError):
-    def __init__(self, path):
-        message = "Can't translate name safely to filesystem: %r" % path
-        super().__init__(message)
+
+    def __init__(self, path: str) -> None:
+        super().__init__("Can't translate name safely to filesystem: %r" %
+                         path)


 class CollidingPathError(ValueError):
-    def __init__(self, path):
-        message = "File name collision: %r" % path
-        super().__init__(message)
+
+    def __init__(self, path: str) -> None:
+        super().__init__("File name collision: %r" % path)


-def name_from_path(path, collection):
+def name_from_path(path: str, collection: "storage.BaseCollection") -> str:
     """Return Radicale item name from ``path``."""
     assert sanitize_path(path) == path
     start = unstrip_path(collection.path, True)

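When `renameat2(..., RENAME_EXCHANGE)` is unavailable or unsupported by the filesystem, the new `rename_exchange` falls back to a three-step swap through a temporary directory next to the source. A sketch of just that fallback path (hypothetical helper name `swap_paths`; unlike the Linux fast path, this is not atomic):

```python
import os
from pathlib import Path
from tempfile import TemporaryDirectory


def swap_paths(src, dst):
    """Exchange two paths by parking one in a temp dir beside the source."""
    src_dir = os.path.dirname(src) or os.curdir
    with TemporaryDirectory(prefix=".swap.tmp-", dir=src_dir) as tmp_dir:
        interim = os.path.join(tmp_dir, "interim")
        os.rename(dst, interim)   # park dst out of the way
        os.rename(src, dst)       # src takes dst's place
        os.rename(interim, src)   # parked dst takes src's place


# usage
with TemporaryDirectory() as d:
    a, b = os.path.join(d, "a"), os.path.join(d, "b")
    Path(a).write_text("A")
    Path(b).write_text("B")
    swap_paths(a, b)
    contents = (Path(a).read_text(), Path(b).read_text())
print(contents)  # ('B', 'A')
```

Keeping the temporary directory on the same filesystem as the source matters: `os.rename` cannot move across filesystems, which is also why the real code passes `dir=src_dir`.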

0
radicale/py.typed
Normal file

@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
@@ -32,17 +32,21 @@ Take a look at the class ``BaseRights`` if you want to implement your own.

 """

-from radicale import utils
+from typing import Sequence

-INTERNAL_TYPES = ("authenticated", "owner_write", "owner_only", "from_file")
+from radicale import config, utils
+
+INTERNAL_TYPES: Sequence[str] = ("authenticated", "owner_write", "owner_only",
+                                 "from_file")


-def load(configuration):
+def load(configuration: "config.Configuration") -> "BaseRights":
     """Load the rights module chosen in configuration."""
-    return utils.load_plugin(INTERNAL_TYPES, "rights", "Rights", configuration)
+    return utils.load_plugin(INTERNAL_TYPES, "rights", "Rights", BaseRights,
+                             configuration)


-def intersect(a, b):
+def intersect(a: str, b: str) -> str:
     """Intersect two lists of rights.

     Returns all rights that are both in ``a`` and ``b``.
@@ -52,7 +56,8 @@ def intersect(a, b):


 class BaseRights:
-    def __init__(self, configuration):
+
+    def __init__(self, configuration: "config.Configuration") -> None:
         """Initialize BaseRights.

         ``configuration`` see ``radicale.config`` module.
@@ -62,7 +67,7 @@ class BaseRights:
         """
         self.configuration = configuration

-    def authorization(self, user, path):
+    def authorization(self, user: str, path: str) -> str:
         """Get granted rights of ``user`` for the collection ``path``.

         If ``user`` is empty, check for anonymous rights.

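`intersect` is documented to return all rights that are present in both strings (rights are single-character flags such as `r` and `w`). A hedged sketch of that contract (an assumed implementation for illustration, not necessarily Radicale's):

```python
def intersect(a: str, b: str) -> str:
    """Keep only the right flags present in both strings."""
    return "".join(flag for flag in a if flag in b)


print(intersect("rw", "Rw"))  # w
print(intersect("rw", "wr"))  # rw
```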
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
@@ -21,15 +21,16 @@ calendars and address books.

 """

-from radicale import pathutils, rights
+from radicale import config, pathutils, rights


 class Rights(rights.BaseRights):
-    def __init__(self, *args, **kwargs):
-        super().__init__(*args, **kwargs)
+
+    def __init__(self, configuration: config.Configuration) -> None:
+        super().__init__(configuration)
         self._verify_user = self.configuration.get("auth", "type") != "none"

-    def authorization(self, user, path):
+    def authorization(self, user: str, path: str) -> str:
         if self._verify_user and not user:
             return ""
         sane_path = pathutils.strip_path(path)

@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2019 Unrud <unrud@outlook.com>
 #
@@ -37,25 +37,27 @@ Leading or ending slashes are trimmed from collection's path.
 import configparser
 import re

-from radicale import pathutils, rights
+from radicale import config, pathutils, rights
 from radicale.log import logger


 class Rights(rights.BaseRights):
-    def __init__(self, configuration):
+
+    _filename: str
+
+    def __init__(self, configuration: config.Configuration) -> None:
         super().__init__(configuration)
         self._filename = configuration.get("rights", "file")

-    def authorization(self, user, path):
+    def authorization(self, user: str, path: str) -> str:
         user = user or ""
         sane_path = pathutils.strip_path(path)
         # Prevent "regex injection"
         escaped_user = re.escape(user)
         rights_config = configparser.ConfigParser()
         try:
-            if not rights_config.read(self._filename):
-                raise RuntimeError("No such file: %r" %
-                                   self._filename)
+            with open(self._filename, "r") as f:
+                rights_config.read_file(f)
         except Exception as e:
             raise RuntimeError("Failed to load rights file %r: %s" %
                                (self._filename, e)) from e
@@ -67,7 +69,7 @@ class Rights(rights.BaseRights):
                 user_match = re.fullmatch(user_pattern.format(), user)
                 collection_match = user_match and re.fullmatch(
                     collection_pattern.format(
-                        *map(re.escape, user_match.groups()),
+                        *(re.escape(s) for s in user_match.groups()),
                         user=escaped_user), sane_path)
             except Exception as e:
                 raise RuntimeError("Error in section %r of rights file %r: "

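The `from_file` backend escapes the user name before substituting it into each collection pattern, so a crafted user name cannot inject regex syntax into the rule. A sketch of that defense with a made-up rights rule (`collection_pattern` is hypothetical):

```python
import re

collection_pattern = "{user}/[^/]+"   # hypothetical rights-file rule


def matches(user: str, sane_path: str) -> bool:
    escaped_user = re.escape(user)    # prevent "regex injection"
    return re.fullmatch(
        collection_pattern.format(user=escaped_user), sane_path) is not None


print(matches("alice", "alice/cal"))   # True
print(matches(".*", "anything/cal"))   # False: ".*" is taken literally
```

Without `re.escape`, a user named `.*` would turn the rule into `.*/[^/]+` and match every collection.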
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
@@ -26,7 +26,8 @@ from radicale import pathutils


 class Rights(authenticated.Rights):
-    def authorization(self, user, path):
+
+    def authorization(self, user: str, path: str) -> str:
         if self._verify_user and not user:
             return ""
         sane_path = pathutils.strip_path(path)

@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
@@ -26,7 +26,8 @@ from radicale import pathutils


 class Rights(authenticated.Rights):
-    def authorization(self, user, path):
+
+    def authorization(self, user: str, path: str) -> str:
         if self._verify_user and not user:
             return ""
         sane_path = pathutils.strip_path(path)

@ -1,4 +1,4 @@
|
|||
# This file is part of Radicale Server - Calendar Server
|
||||
# This file is part of Radicale - CalDAV and CardDAV server
|
||||
# Copyright © 2008 Nicolas Kandel
|
||||
# Copyright © 2008 Pascal Halter
|
||||
# Copyright © 2008-2017 Guillaume Ayoub
|
||||
|
@ -23,81 +23,129 @@ Built-in WSGI server.
|
|||
"""
|
||||
|
||||
import errno
|
||||
import os
|
||||
import http
|
||||
import select
|
||||
import socket
|
||||
import socketserver
|
||||
import ssl
|
||||
import sys
|
||||
import wsgiref.simple_server
|
||||
from typing import (Any, Callable, Dict, List, MutableMapping, Optional, Set,
|
||||
Tuple, Union)
|
||||
from urllib.parse import unquote
|
||||
|
||||
from radicale import Application, config
|
||||
from radicale.log import logger
|
||||
|
||||
COMPAT_EAI_ADDRFAMILY: int
|
||||
if hasattr(socket, "EAI_ADDRFAMILY"):
|
||||
COMPAT_EAI_ADDRFAMILY = socket.EAI_ADDRFAMILY
|
||||
COMPAT_EAI_ADDRFAMILY = socket.EAI_ADDRFAMILY # type:ignore[attr-defined]
|
||||
elif hasattr(socket, "EAI_NONAME"):
|
||||
# Windows and BSD don't have a special error code for this
|
||||
COMPAT_EAI_ADDRFAMILY = socket.EAI_NONAME
|
||||
COMPAT_EAI_NODATA: int
|
||||
if hasattr(socket, "EAI_NODATA"):
|
||||
COMPAT_EAI_NODATA = socket.EAI_NODATA
|
||||
elif hasattr(socket, "EAI_NONAME"):
|
||||
# Windows and BSD don't have a special error code for this
|
||||
COMPAT_EAI_NODATA = socket.EAI_NONAME
|
||||
COMPAT_IPPROTO_IPV6: int
|
||||
if hasattr(socket, "IPPROTO_IPV6"):
|
||||
COMPAT_IPPROTO_IPV6 = socket.IPPROTO_IPV6
|
||||
elif os.name == "nt":
|
||||
# Workaround: https://bugs.python.org/issue29515
|
||||
elif sys.platform == "win32":
|
||||
# HACK: https://bugs.python.org/issue29515
|
||||
COMPAT_IPPROTO_IPV6 = 41
|
||||
|
||||
|
||||
def format_address(address):
|
||||
# IPv4 (host, port) and IPv6 (host, port, flowinfo, scopeid)
|
||||
ADDRESS_TYPE = Union[Tuple[str, int], Tuple[str, int, int, int]]
|
||||
|
||||
|
||||
def format_address(address: ADDRESS_TYPE) -> str:
|
||||
return "[%s]:%d" % address[:2]
|
||||
|
||||
|
||||
class ParallelHTTPServer(socketserver.ThreadingMixIn,
|
||||
wsgiref.simple_server.WSGIServer):
|
||||
|
||||
# We wait for child threads ourself
|
||||
block_on_close = False
|
||||
configuration: config.Configuration
|
||||
worker_sockets: Set[socket.socket]
|
||||
_timeout: float
|
||||
|
||||
def __init__(self, configuration, family, address, RequestHandlerClass):
|
||||
# We wait for child threads ourself (ThreadingMixIn)
|
||||
block_on_close: bool = False
|
||||
daemon_threads: bool = True
|
||||
|
||||
def __init__(self, configuration: config.Configuration, family: int,
|
||||
address: Tuple[str, int], RequestHandlerClass:
|
||||
Callable[..., http.server.BaseHTTPRequestHandler]) -> None:
|
||||
self.configuration = configuration
|
||||
self.address_family = family
|
||||
super().__init__(address, RequestHandlerClass)
|
||||
self.client_sockets = set()
|
||||
self.worker_sockets = set()
|
||||
self._timeout = configuration.get("server", "timeout")
|
||||
|
||||
def server_bind(self):
|
||||
def server_bind(self) -> None:
|
||||
if self.address_family == socket.AF_INET6:
|
||||
# Only allow IPv6 connections to the IPv6 socket
|
||||
self.socket.setsockopt(COMPAT_IPPROTO_IPV6, socket.IPV6_V6ONLY, 1)
|
||||
super().server_bind()
|
||||
|
||||
def get_request(self):
|
||||
def get_request( # type:ignore[override]
|
||||
self) -> Tuple[socket.socket, Tuple[ADDRESS_TYPE, socket.socket]]:
|
||||
# Set timeout for client
|
||||
request, client_address = super().get_request()
|
||||
timeout = self.configuration.get("server", "timeout")
|
||||
if timeout:
|
||||
request.settimeout(timeout)
|
||||
client_socket, client_socket_out = socket.socketpair()
|
||||
self.client_sockets.add(client_socket_out)
|
||||
return request, (*client_address, client_socket)
|
||||
request: socket.socket
|
||||
client_address: ADDRESS_TYPE
|
||||
request, client_address = super().get_request() # type:ignore[misc]
|
||||
if self._timeout > 0:
|
||||
request.settimeout(self._timeout)
|
||||
worker_socket, worker_socket_out = socket.socketpair()
|
||||
self.worker_sockets.add(worker_socket_out)
|
||||
# HACK: Forward `worker_socket` via `client_address` return value
|
||||
# to worker thread.
|
||||
# The super class calls `verify_request`, `process_request` and
|
||||
# `handle_error` with modified `client_address` value.
|
||||
return request, (client_address, worker_socket)
|
||||
|
||||
def finish_request_locked(self, request, client_address):
|
||||
return super().finish_request(request, client_address)
|
||||
def verify_request( # type:ignore[override]
|
||||
self, request: socket.socket, client_address_and_socket:
|
||||
Tuple[ADDRESS_TYPE, socket.socket]) -> bool:
|
||||
return True
|
||||
|
||||
def finish_request(self, request, client_address):
|
||||
*client_address, client_socket = client_address
|
||||
client_address = tuple(client_address)
|
||||
def process_request( # type:ignore[override]
|
||||
self, request: socket.socket, client_address_and_socket:
|
||||
Tuple[ADDRESS_TYPE, socket.socket]) -> None:
|
||||
# HACK: Super class calls `finish_request` in new thread with
|
||||
# `client_address_and_socket`
|
||||
return super().process_request(
|
||||
request, client_address_and_socket) # type:ignore[arg-type]
|
||||
|
||||
def finish_request( # type:ignore[override]
|
||||
self, request: socket.socket, client_address_and_socket:
|
||||
Tuple[ADDRESS_TYPE, socket.socket]) -> None:
|
||||
# HACK: Unpack `client_address_and_socket` and call super class
|
||||
# `finish_request` with original `client_address`
|
||||
client_address, worker_socket = client_address_and_socket
|
||||
try:
|
||||
return self.finish_request_locked(request, client_address)
|
||||
finally:
|
||||
client_socket.close()
|
||||
worker_socket.close()
|
||||
|
||||
def handle_error(self, request, client_address):
|
||||
if issubclass(sys.exc_info()[0], socket.timeout):
|
||||
logger.info("client timed out", exc_info=True)
|
||||
def finish_request_locked(self, request: socket.socket,
|
||||
client_address: ADDRESS_TYPE) -> None:
|
||||
return super().finish_request(
|
||||
request, client_address) # type:ignore[arg-type]
|
||||
|
||||
def handle_error( # type:ignore[override]
|
||||
self, request: socket.socket,
|
||||
client_address_or_client_address_and_socket:
|
||||
Union[ADDRESS_TYPE, Tuple[ADDRESS_TYPE, socket.socket]]) -> None:
|
||||
# HACK: This method can be called with the modified
|
||||
# `client_address_and_socket` or the original `client_address` value
|
||||
e = sys.exc_info()[1]
|
||||
assert e is not None
|
||||
if isinstance(e, socket.timeout):
|
||||
logger.info("Client timed out", exc_info=True)
|
||||
else:
|
||||
logger.error("An exception occurred during request: %s",
|
||||
sys.exc_info()[1], exc_info=True)
|
||||
|
@ -105,12 +153,12 @@ class ParallelHTTPServer(socketserver.ThreadingMixIn,
|
|||
|
||||
class ParallelHTTPSServer(ParallelHTTPServer):
|
||||
|
||||
def server_bind(self):
|
||||
def server_bind(self) -> None:
|
||||
super().server_bind()
|
||||
# Wrap the TCP socket in an SSL socket
|
||||
certfile = self.configuration.get("server", "certificate")
|
||||
keyfile = self.configuration.get("server", "key")
|
||||
cafile = self.configuration.get("server", "certificate_authority")
|
||||
certfile: str = self.configuration.get("server", "certificate")
|
||||
keyfile: str = self.configuration.get("server", "key")
|
||||
cafile: str = self.configuration.get("server", "certificate_authority")
|
||||
# Test if the files can be read
|
||||
for name, filename in [("certificate", certfile), ("key", keyfile),
|
||||
("certificate_authority", cafile)]:
|
||||
|
@ -134,7 +182,9 @@ class ParallelHTTPSServer(ParallelHTTPServer):
|
|||
self.socket = context.wrap_socket(
|
||||
self.socket, server_side=True, do_handshake_on_connect=False)
|
||||
|
||||
def finish_request_locked(self, request, client_address):
|
||||
def finish_request_locked( # type:ignore[override]
|
||||
self, request: ssl.SSLSocket, client_address: ADDRESS_TYPE
|
||||
) -> None:
|
||||
try:
|
||||
try:
|
||||
request.do_handshake()
|
||||
|
@ -146,7 +196,7 @@ class ParallelHTTPSServer(ParallelHTTPServer):
|
|||
try:
|
||||
self.handle_error(request, client_address)
|
||||
finally:
|
||||
self.shutdown_request(request)
|
||||
self.shutdown_request(request) # type:ignore[attr-defined]
|
||||
return
|
||||
return super().finish_request_locked(request, client_address)
|
||||
|
||||
|
@ -154,32 +204,36 @@ class ParallelHTTPSServer(ParallelHTTPServer):
|
|||
class ServerHandler(wsgiref.simple_server.ServerHandler):
|
||||
|
||||
# Don't pollute WSGI environ with OS environment
|
||||
os_environ = {}
|
||||
os_environ: MutableMapping[str, str] = {}
|
||||
|
||||
def log_exception(self, exc_info):
|
||||
def log_exception(self, exc_info) -> None:
|
||||
logger.error("An exception occurred during request: %s",
|
||||
exc_info[1], exc_info=exc_info)
|
||||
exc_info[1], exc_info=exc_info) # type:ignore[arg-type]
|
||||
|
||||
|
||||
class RequestHandler(wsgiref.simple_server.WSGIRequestHandler):
|
||||
"""HTTP requests handler."""
|
||||
|
||||
def log_request(self, code="-", size="-"):
|
||||
# HACK: Assigned in `socketserver.StreamRequestHandler`
|
||||
connection: socket.socket
|
||||
|
||||
def log_request(self, code: Union[int, str] = "-",
|
||||
size: Union[int, str] = "-") -> None:
|
||||
pass # Disable request logging.
|
||||
|
||||
def log_error(self, format_, *args):
|
||||
def log_error(self, format_: str, *args: Any) -> None:
|
||||
logger.error("An error occurred during request: %s", format_ % args)
|
||||
|
||||
def get_environ(self):
|
||||
def get_environ(self) -> Dict[str, Any]:
|
||||
env = super().get_environ()
|
||||
if hasattr(self.connection, "getpeercert"):
|
||||
if isinstance(self.connection, ssl.SSLSocket):
|
||||
# The certificate can be evaluated by the auth module
|
||||
env["REMOTE_CERTIFICATE"] = self.connection.getpeercert()
|
||||
# Parent class only tries latin1 encoding
|
||||
env["PATH_INFO"] = unquote(self.path.split("?", 1)[0])
|
||||
return env
|
||||
|
||||
def handle(self):
|
||||
def handle(self) -> None:
|
||||
"""Copy of WSGIRequestHandler.handle with different ServerHandler"""
|
||||
|
||||
self.raw_requestline = self.rfile.readline(65537)
|
||||
|
@@ -196,24 +250,35 @@ class RequestHandler(wsgiref.simple_server.WSGIRequestHandler):
         handler = ServerHandler(
             self.rfile, self.wfile, self.get_stderr(), self.get_environ()
         )
-        handler.request_handler = self
-        handler.run(self.server.get_app())
+        handler.request_handler = self  # type:ignore[attr-defined]
+        app = self.server.get_app()  # type:ignore[attr-defined]
+        handler.run(app)


-def serve(configuration, shutdown_socket):
-    """Serve radicale from configuration."""
+def serve(configuration: config.Configuration,
+          shutdown_socket: Optional[socket.socket] = None) -> None:
+    """Serve radicale from configuration.
+
+    `shutdown_socket` can be used to gracefully shutdown the server.
+    The socket can be created with `socket.socketpair()`, when the other socket
+    gets closed the server stops accepting new requests by clients and the
+    function returns after all active requests are finished.
+
+    """

     logger.info("Starting Radicale")
     # Copy configuration before modifying
     configuration = configuration.copy()
     configuration.update({"server": {"_internal_server": "True"}}, "server",
                          privileged=True)

-    use_ssl = configuration.get("server", "ssl")
+    use_ssl: bool = configuration.get("server", "ssl")
     server_class = ParallelHTTPSServer if use_ssl else ParallelHTTPServer
     application = Application(configuration)
     servers = {}
     try:
-        for address in configuration.get("server", "hosts"):
+        hosts: List[Tuple[str, int]] = configuration.get("server", "hosts")
+        for address in hosts:
             # Try to bind sockets for IPv4 and IPv6
             possible_families = (socket.AF_INET, socket.AF_INET6)
             bind_ok = False
@@ -240,7 +305,9 @@ def serve(configuration, shutdown_socket):
                             # IPV6_V6ONLY set
                             e.errno == errno.EADDRNOTAVAIL or
                             # Address family not supported
-                            e.errno == errno.EAFNOSUPPORT)):
+                            e.errno == errno.EAFNOSUPPORT or
+                            # Protocol not supported
+                            e.errno == errno.EPROTONOSUPPORT)):
                         continue
                     raise RuntimeError("Failed to start server %r: %s" % (
                         format_address(address), e)) from e
@@ -250,46 +317,48 @@ def serve(configuration, shutdown_socket):
             logger.info("Listening on %r%s",
                         format_address(server.server_address),
                         " with SSL" if use_ssl else "")
-        assert servers, "no servers started"
+        if not servers:
+            raise RuntimeError("No servers started")

         # Mainloop
         select_timeout = None
-        if os.name == "nt":
+        if sys.platform == "win32":
             # Fallback to busy waiting. (select(...) blocks SIGINT on Windows.)
             select_timeout = 1.0
-        max_connections = configuration.get("server", "max_connections")
+        max_connections: int = configuration.get("server", "max_connections")
         logger.info("Radicale server ready")
         while True:
-            rlist = []
+            rlist: List[socket.socket] = []
             # Wait for finished clients
             for server in servers.values():
-                rlist.extend(server.client_sockets)
+                rlist.extend(server.worker_sockets)
             # Accept new connections if max_connections is not reached
             if max_connections <= 0 or len(rlist) < max_connections:
                 rlist.extend(servers)
             # Use socket to get notified of program shutdown
-            rlist.append(shutdown_socket)
+            if shutdown_socket is not None:
+                rlist.append(shutdown_socket)
             rlist, _, _ = select.select(rlist, [], [], select_timeout)
-            rlist = set(rlist)
-            if shutdown_socket in rlist:
+            rset = set(rlist)
+            if shutdown_socket in rset:
                 logger.info("Stopping Radicale")
                 break
             for server in servers.values():
-                finished_sockets = server.client_sockets.intersection(rlist)
+                finished_sockets = server.worker_sockets.intersection(rset)
                 for s in finished_sockets:
                     s.close()
-                    server.client_sockets.remove(s)
-                    rlist.remove(s)
+                    server.worker_sockets.remove(s)
+                    rset.remove(s)
                 if finished_sockets:
                     server.service_actions()
-            if rlist:
-                server = servers.get(rlist.pop())
-                if server:
-                    server.handle_request()
+            if rset:
+                active_server = servers.get(rset.pop())
+                if active_server:
+                    active_server.handle_request()
     finally:
         # Wait for clients to finish and close servers
         for server in servers.values():
-            for s in server.client_sockets:
+            for s in server.worker_sockets:
                 s.recv(1)
                 s.close()
             server.server_close()
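The `shutdown_socket` mechanism documented in the new `serve()` docstring can be sketched in isolation. This is a minimal illustration, not Radicale code; the function and variable names are made up for the example. A worker waits in `select()` on one end of a socket pair, and closing the other end makes the socket readable, which signals shutdown:

```python
import select
import socket
import threading


def wait_for_shutdown(shutdown_socket: socket.socket,
                      timeout: float = 5.0) -> bool:
    """Block until the peer of ``shutdown_socket`` is closed (or timeout)."""
    rlist, _, _ = select.select([shutdown_socket], [], [], timeout)
    # A closed peer makes the socket readable (recv() would return b"").
    return shutdown_socket in rlist


# Usage: the controlling thread keeps one end and closes it to stop the worker.
shutdown_socket, shutdown_socket_out = socket.socketpair()
result = []
worker = threading.Thread(
    target=lambda: result.append(wait_for_shutdown(shutdown_socket_out)))
worker.start()
shutdown_socket.close()  # signal shutdown
worker.join()
```

The same readability signal is what the main loop above checks with `if shutdown_socket in rset:` before breaking out.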
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -23,37 +23,43 @@ Take a look at the class ``BaseCollection`` if you want to implement your own.

 """

-import contextlib
 import json
+import xml.etree.ElementTree as ET
 from hashlib import sha256
+from typing import (Iterable, Iterator, Mapping, Optional, Sequence, Set,
+                    Tuple, Union, overload)

-import pkg_resources
 import vobject

-from radicale import utils
+from radicale import config
+from radicale import item as radicale_item
+from radicale import types, utils
 from radicale.item import filter as radicale_filter

-INTERNAL_TYPES = ("multifilesystem",)
+INTERNAL_TYPES: Sequence[str] = ("multifilesystem", "multifilesystem_nolock",)

-CACHE_DEPS = ("radicale", "vobject", "python-dateutil",)
-CACHE_VERSION = (";".join(pkg_resources.get_distribution(pkg).version
-                          for pkg in CACHE_DEPS) + ";").encode()
+CACHE_DEPS: Sequence[str] = ("radicale", "vobject", "python-dateutil",)
+CACHE_VERSION: bytes = "".join(
+    "%s=%s;" % (pkg, utils.package_version(pkg))
+    for pkg in CACHE_DEPS).encode()


-def load(configuration):
+def load(configuration: "config.Configuration") -> "BaseStorage":
     """Load the storage module chosen in configuration."""
-    return utils.load_plugin(
-        INTERNAL_TYPES, "storage", "Storage", configuration)
+    return utils.load_plugin(INTERNAL_TYPES, "storage", "Storage", BaseStorage,
+                             configuration)


 class ComponentExistsError(ValueError):
-    def __init__(self, path):
+
+    def __init__(self, path: str) -> None:
         message = "Component already exists: %r" % path
         super().__init__(message)


 class ComponentNotFoundError(ValueError):
-    def __init__(self, path):
+
+    def __init__(self, path: str) -> None:
         message = "Component doesn't exist: %r" % path
         super().__init__(message)
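The new `CACHE_VERSION` drops the `pkg_resources` lookup in favor of `utils.package_version` and encodes each dependency together with its version, so cached pickles are invalidated when any dependency changes. The same `name=version;` format can be sketched standalone; the version numbers below are made up for the example, not real package lookups:

```python
from typing import Mapping, Sequence


def build_cache_version(versions: Mapping[str, str],
                        deps: Sequence[str]) -> bytes:
    # Same "name=version;" byte-string format as the new CACHE_VERSION.
    return "".join("%s=%s;" % (pkg, versions[pkg]) for pkg in deps).encode()


deps = ("radicale", "vobject", "python-dateutil")
versions = {"radicale": "3.1", "vobject": "0.9.6", "python-dateutil": "2.8"}
cache_version = build_cache_version(versions, deps)
# → b"radicale=3.1;vobject=0.9.6;python-dateutil=2.8;"
```

Because the string is hashed into every cache entry (see the cache changes later in this commit), bumping any of these versions silently invalidates the whole item cache.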
@@ -61,47 +67,58 @@ class ComponentNotFoundError(ValueError):
 class BaseCollection:

     @property
-    def path(self):
+    def path(self) -> str:
         """The sanitized path of the collection without leading or
         trailing ``/``."""
         raise NotImplementedError

     @property
-    def owner(self):
+    def owner(self) -> str:
         """The owner of the collection."""
         return self.path.split("/", maxsplit=1)[0]

     @property
-    def is_principal(self):
+    def is_principal(self) -> bool:
         """Collection is a principal."""
         return bool(self.path) and "/" not in self.path

     @property
-    def etag(self):
+    def etag(self) -> str:
         """Encoded as quoted-string (see RFC 2616)."""
         etag = sha256()
         for item in self.get_all():
+            assert item.href
             etag.update((item.href + "/" + item.etag).encode())
         etag.update(json.dumps(self.get_meta(), sort_keys=True).encode())
         return '"%s"' % etag.hexdigest()

-    def sync(self, old_token=None):
+    @property
+    def tag(self) -> str:
+        """The tag of the collection."""
+        return self.get_meta("tag") or ""
+
+    def sync(self, old_token: str = "") -> Tuple[str, Iterable[str]]:
         """Get the current sync token and changed items for synchronization.

         ``old_token`` an old sync token which is used as the base of the
-        delta update. If sync token is missing, all items are returned.
+        delta update. If sync token is empty, all items are returned.
         ValueError is raised for invalid or old tokens.

         WARNING: This simple default implementation treats all sync-token as
         invalid.

         """
+        def hrefs_iter() -> Iterator[str]:
+            for item in self.get_all():
+                assert item.href
+                yield item.href
         token = "http://radicale.org/ns/sync/%s" % self.etag.strip("\"")
         if old_token:
             raise ValueError("Sync token are not supported")
-        return token, (item.href for item in self.get_all())
+        return token, hrefs_iter()

-    def get_multi(self, hrefs):
+    def get_multi(self, hrefs: Iterable[str]
+                  ) -> Iterable[Tuple[str, Optional["radicale_item.Item"]]]:
         """Fetch multiple items.

         It's not required to return the requested items in the correct order.
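`BaseCollection.etag` hashes every item's `href`/`etag` pair plus the collection metadata into a single quoted-string ETag. The same scheme can be sketched standalone; plain `(href, etag)` tuples stand in for Radicale items here, and the function name is invented for the example:

```python
import json
from hashlib import sha256
from typing import Iterable, Mapping, Tuple


def collection_etag(items: Iterable[Tuple[str, str]],
                    meta: Mapping[str, str]) -> str:
    """Quoted-string ETag over (href, etag) pairs and collection metadata."""
    etag = sha256()
    for href, item_etag in items:
        etag.update((href + "/" + item_etag).encode())
    # sort_keys makes the hash independent of dict insertion order
    etag.update(json.dumps(meta, sort_keys=True).encode())
    return '"%s"' % etag.hexdigest()


a = collection_etag([("x.ics", '"1"')], {"tag": "VCALENDAR"})
b = collection_etag([("x.ics", '"2"')], {"tag": "VCALENDAR"})
```

Changing any item's ETag or any metadata property therefore changes the collection ETag, which is exactly what CalDAV clients rely on to detect stale local copies.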
@@ -113,11 +130,12 @@ class BaseCollection:

         """
         raise NotImplementedError

-    def get_all(self):
+    def get_all(self) -> Iterable["radicale_item.Item"]:
         """Fetch all items."""
         raise NotImplementedError

-    def get_filtered(self, filters):
+    def get_filtered(self, filters: Iterable[ET.Element]
+                     ) -> Iterable[Tuple["radicale_item.Item", bool]]:
         """Fetch all items with optional filtering.

         This can largely improve performance of reports depending on
@@ -128,32 +146,31 @@ class BaseCollection:
         matched.

         """
+        if not self.tag:
+            return
         tag, start, end, simple = radicale_filter.simplify_prefilters(
-            filters, collection_tag=self.get_meta("tag"))
+            filters, self.tag)
         for item in self.get_all():
-            if tag:
-                if tag != item.component_name:
-                    continue
-                istart, iend = item.time_range
-                if istart >= end or iend <= start:
-                    continue
-                item_simple = simple and (start <= istart or iend <= end)
-            else:
-                item_simple = simple
-            yield item, item_simple
+            if tag is not None and tag != item.component_name:
+                continue
+            istart, iend = item.time_range
+            if istart >= end or iend <= start:
+                continue
+            yield item, simple and (start <= istart or iend <= end)

-    def has_uid(self, uid):
+    def has_uid(self, uid: str) -> bool:
         """Check if a UID exists in the collection."""
         for item in self.get_all():
             if item.uid == uid:
                 return True
         return False

-    def upload(self, href, item):
+    def upload(self, href: str, item: "radicale_item.Item") -> (
+            "radicale_item.Item"):
         """Upload a new or replace an existing item."""
         raise NotImplementedError

-    def delete(self, href=None):
+    def delete(self, href: Optional[str] = None) -> None:
         """Delete an item.

         When ``href`` is ``None``, delete the collection.
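The rewritten `get_filtered()` loop keeps an item when its `[istart, iend)` range overlaps the filter window `[start, end)`, and additionally reports whether the simple prefilter fully decided the match. The overlap test can be sketched on bare integers; the helper name and return convention are invented for this illustration:

```python
from typing import Tuple


def prefilter(time_range: Tuple[int, int], start: int, end: int,
              simple: bool) -> Tuple[bool, bool]:
    """Return (matches, fully_decided), mirroring the loop body above."""
    istart, iend = time_range
    if istart >= end or iend <= start:
        return False, True  # no overlap: definitely filtered out
    # Overlap found; the prefilter result is only final when the item
    # lies suitably relative to the window (same condition as the code).
    return True, simple and (start <= istart or iend <= end)


m1 = prefilter((5, 10), 0, 20, True)    # item inside the window
m2 = prefilter((30, 40), 0, 20, True)   # item entirely after the window
```

Items where the second flag is `False` still have to be run through the full (expensive) XML filter, which is why this cheap pass can "largely improve performance of reports".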
@@ -161,7 +178,14 @@ class BaseCollection:

         """
         raise NotImplementedError

-    def get_meta(self, key=None):
+    @overload
+    def get_meta(self, key: None = None) -> Mapping[str, str]: ...
+
+    @overload
+    def get_meta(self, key: str) -> Optional[str]: ...
+
+    def get_meta(self, key: Optional[str] = None
+                 ) -> Union[Mapping[str, str], Optional[str]]:
         """Get metadata value for collection.

         Return the value of the property ``key``. If ``key`` is ``None`` return
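The `get_meta` change uses `typing.overload` so that type checkers see `Mapping[str, str]` for `get_meta()` and `Optional[str]` for `get_meta(key)`, while only one implementation runs at runtime. The pattern in isolation, over a plain dict (the `Meta` class here is hypothetical, not Radicale code):

```python
from typing import Mapping, Optional, Union, overload


class Meta:
    def __init__(self, props: Mapping[str, str]) -> None:
        self._props = dict(props)

    @overload
    def get_meta(self, key: None = None) -> Mapping[str, str]: ...

    @overload
    def get_meta(self, key: str) -> Optional[str]: ...

    def get_meta(self, key: Optional[str] = None
                 ) -> Union[Mapping[str, str], Optional[str]]:
        # Only this body executes; the overloads above exist purely
        # to give type checkers a precise per-call-site return type.
        if key is None:
            return self._props
        return self._props.get(key)


meta = Meta({"tag": "VCALENDAR"})
```

Note that the `...` overload stubs must come directly before the real implementation and carry no body of their own.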
@@ -170,7 +194,7 @@ class BaseCollection:

         """
         raise NotImplementedError

-    def set_meta(self, props):
+    def set_meta(self, props: Mapping[str, str]) -> None:
         """Set metadata values for collection.

         ``props`` a dict with values for properties.
@@ -179,23 +203,23 @@ class BaseCollection:
         raise NotImplementedError

     @property
-    def last_modified(self):
+    def last_modified(self) -> str:
         """Get the HTTP-datetime of when the collection was modified."""
         raise NotImplementedError

-    def serialize(self):
+    def serialize(self) -> str:
         """Get the unicode string representing the whole collection."""
-        if self.get_meta("tag") == "VCALENDAR":
+        if self.tag == "VCALENDAR":
             in_vcalendar = False
             vtimezones = ""
-            included_tzids = set()
+            included_tzids: Set[str] = set()
             vtimezone = []
             tzid = None
             components = ""
             # Concatenate all child elements of VCALENDAR from all items
             # together, while preventing duplicated VTIMEZONE entries.
             # VTIMEZONEs are only distinguished by their TZID, if different
-            # timezones share the same TZID this produces errornous ouput.
+            # timezones share the same TZID this produces erroneous output.
             # VObject fails at this too.
             for item in self.get_all():
                 depth = 0
@@ -216,6 +240,7 @@ class BaseCollection:
                     elif depth == 2 and line.startswith("END:"):
                         if tzid is None or tzid not in included_tzids:
                             vtimezones += "".join(vtimezone)
-                            included_tzids.add(tzid)
+                            if tzid is not None:
+                                included_tzids.add(tzid)
                         vtimezone.clear()
                         tzid = None
@@ -240,13 +265,14 @@ class BaseCollection:
             return (template[:template_insert_pos] +
                     vtimezones + components +
                     template[template_insert_pos:])
-        if self.get_meta("tag") == "VADDRESSBOOK":
+        if self.tag == "VADDRESSBOOK":
             return "".join((item.serialize() for item in self.get_all()))
         return ""


 class BaseStorage:
-    def __init__(self, configuration):
+
+    def __init__(self, configuration: "config.Configuration") -> None:
         """Initialize BaseStorage.

         ``configuration`` see ``radicale.config`` module.
@@ -256,7 +282,8 @@ class BaseStorage:
         """
         self.configuration = configuration

-    def discover(self, path, depth="0"):
+    def discover(self, path: str, depth: str = "0") -> Iterable[
+            "types.CollectionOrItem"]:
         """Discover a list of collections under the given ``path``.

         ``path`` is sanitized.
@@ -272,7 +299,8 @@ class BaseStorage:

         """
         raise NotImplementedError

-    def move(self, item, to_collection, to_href):
+    def move(self, item: "radicale_item.Item", to_collection: BaseCollection,
+             to_href: str) -> None:
         """Move an object.

         ``item`` is the item to move.
@@ -285,7 +313,10 @@ class BaseStorage:

         """
         raise NotImplementedError

-    def create_collection(self, href, items=None, props=None):
+    def create_collection(
+            self, href: str,
+            items: Optional[Iterable["radicale_item.Item"]] = None,
+            props: Optional[Mapping[str, str]] = None) -> BaseCollection:
         """Create a collection.

         ``href`` is the sanitized path.
@@ -298,15 +329,14 @@ class BaseStorage:

         ``props`` are metadata values for the collection.

-        ``props["tag"]`` is the type of collection (VCALENDAR or
-        VADDRESSBOOK). If the key ``tag`` is missing, it is guessed from the
-        collection.
+        ``props["tag"]`` is the type of collection (VCALENDAR or VADDRESSBOOK).
+        If the key ``tag`` is missing, ``items`` is ignored.

         """
         raise NotImplementedError

-    @contextlib.contextmanager
-    def acquire_lock(self, mode, user=None):
+    @types.contextmanager
+    def acquire_lock(self, mode: str, user: str = "") -> Iterator[None]:
         """Set a context manager to lock the whole storage.

         ``mode`` must either be "r" for shared access or "w" for exclusive
@@ -317,6 +347,6 @@ class BaseStorage:

         """
         raise NotImplementedError

-    def verify(self):
+    def verify(self) -> bool:
         """Check the storage for errors."""
         raise NotImplementedError
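`acquire_lock(mode)` hands out shared (`"r"`) or exclusive (`"w"`) access to the whole storage. A minimal in-process readers-writer sketch with `threading` is shown below; it is only an illustration of the semantics, since Radicale's real backend locks across processes via lock files, which this does not attempt:

```python
import contextlib
import threading
from typing import Iterator


class StorageLock:
    """Readers-writer lock: many "r" holders, or exactly one "w" holder."""

    def __init__(self) -> None:
        self._cond = threading.Condition()
        self._readers = 0
        self._writer = False

    @contextlib.contextmanager
    def acquire_lock(self, mode: str, user: str = "") -> Iterator[None]:
        if mode not in ("r", "w"):
            raise ValueError("Invalid mode: %r" % mode)
        with self._cond:
            if mode == "r":
                # Readers only wait for an active writer
                self._cond.wait_for(lambda: not self._writer)
                self._readers += 1
            else:
                # A writer needs the storage completely to itself
                self._cond.wait_for(
                    lambda: not self._writer and self._readers == 0)
                self._writer = True
        try:
            yield
        finally:
            with self._cond:
                if mode == "r":
                    self._readers -= 1
                else:
                    self._writer = False
                self._cond.notify_all()


lock = StorageLock()
with lock.acquire_lock("r"):
    with lock.acquire_lock("r"):  # shared access can overlap
        pass
with lock.acquire_lock("w"):
    exclusive_held = True
```

The context-manager shape matches the interface above: callers never see lock state, only a `with` block that guarantees the requested access level for its duration.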
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2019 Unrud <unrud@outlook.com>
@@ -23,75 +23,57 @@ Uses one folder per collection and one file per collection entry.

 """

-import contextlib
 import os
 import time
-from itertools import chain
-from tempfile import TemporaryDirectory
+from typing import ClassVar, Iterator, Optional, Type

-from radicale import pathutils, storage
-from radicale.storage.multifilesystem.cache import CollectionCacheMixin
+from radicale import config
+from radicale.storage.multifilesystem.base import CollectionBase, StorageBase
+from radicale.storage.multifilesystem.cache import CollectionPartCache
 from radicale.storage.multifilesystem.create_collection import \
-    StorageCreateCollectionMixin
-from radicale.storage.multifilesystem.delete import CollectionDeleteMixin
-from radicale.storage.multifilesystem.discover import StorageDiscoverMixin
-from radicale.storage.multifilesystem.get import CollectionGetMixin
-from radicale.storage.multifilesystem.history import CollectionHistoryMixin
-from radicale.storage.multifilesystem.lock import (CollectionLockMixin,
-                                                   StorageLockMixin)
-from radicale.storage.multifilesystem.meta import CollectionMetaMixin
-from radicale.storage.multifilesystem.move import StorageMoveMixin
-from radicale.storage.multifilesystem.sync import CollectionSyncMixin
-from radicale.storage.multifilesystem.upload import CollectionUploadMixin
-from radicale.storage.multifilesystem.verify import StorageVerifyMixin
+    StoragePartCreateCollection
+from radicale.storage.multifilesystem.delete import CollectionPartDelete
+from radicale.storage.multifilesystem.discover import StoragePartDiscover
+from radicale.storage.multifilesystem.get import CollectionPartGet
+from radicale.storage.multifilesystem.history import CollectionPartHistory
+from radicale.storage.multifilesystem.lock import (CollectionPartLock,
+                                                   StoragePartLock)
+from radicale.storage.multifilesystem.meta import CollectionPartMeta
+from radicale.storage.multifilesystem.move import StoragePartMove
+from radicale.storage.multifilesystem.sync import CollectionPartSync
+from radicale.storage.multifilesystem.upload import CollectionPartUpload
+from radicale.storage.multifilesystem.verify import StoragePartVerify


 class Collection(
-        CollectionCacheMixin, CollectionDeleteMixin, CollectionGetMixin,
-        CollectionHistoryMixin, CollectionLockMixin, CollectionMetaMixin,
-        CollectionSyncMixin, CollectionUploadMixin, storage.BaseCollection):
+        CollectionPartDelete, CollectionPartMeta, CollectionPartSync,
+        CollectionPartUpload, CollectionPartGet, CollectionPartCache,
+        CollectionPartLock, CollectionPartHistory, CollectionBase):

-    def __init__(self, storage_, path, filesystem_path=None):
-        self._storage = storage_
-        folder = self._storage._get_collection_root_folder()
-        # Path should already be sanitized
-        self._path = pathutils.strip_path(path)
-        self._encoding = self._storage.configuration.get("encoding", "stock")
-        if filesystem_path is None:
-            filesystem_path = pathutils.path_to_filesystem(folder, self.path)
-        self._filesystem_path = filesystem_path
+    _etag_cache: Optional[str]
+
+    def __init__(self, storage_: "Storage", path: str,
+                 filesystem_path: Optional[str] = None) -> None:
+        super().__init__(storage_, path, filesystem_path)
         self._etag_cache = None
-        super().__init__()

     @property
-    def path(self):
+    def path(self) -> str:
         return self._path

-    @contextlib.contextmanager
-    def _atomic_write(self, path, mode="w", newline=None):
-        parent_dir, name = os.path.split(path)
-        # Do not use mkstemp because it creates with permissions 0o600
-        with TemporaryDirectory(
-                prefix=".Radicale.tmp-", dir=parent_dir) as tmp_dir:
-            with open(os.path.join(tmp_dir, name), mode, newline=newline,
-                      encoding=None if "b" in mode else self._encoding) as tmp:
-                yield tmp
-                tmp.flush()
-                self._storage._fsync(tmp)
-            os.replace(os.path.join(tmp_dir, name), path)
-        self._storage._sync_directory(parent_dir)
-
     @property
-    def last_modified(self):
-        relevant_files = chain(
-            (self._filesystem_path,),
-            (self._props_path,) if os.path.exists(self._props_path) else (),
-            (os.path.join(self._filesystem_path, h) for h in self._list()))
-        last = max(map(os.path.getmtime, relevant_files))
+    def last_modified(self) -> str:
+        def relevant_files_iter() -> Iterator[str]:
+            yield self._filesystem_path
+            if os.path.exists(self._props_path):
+                yield self._props_path
+            for href in self._list():
+                yield os.path.join(self._filesystem_path, href)
+        last = max(map(os.path.getmtime, relevant_files_iter()))
         return time.strftime("%a, %d %b %Y %H:%M:%S GMT", time.gmtime(last))

     @property
-    def etag(self):
+    def etag(self) -> str:
         # reuse cached value if the storage is read-only
         if self._storage._lock.locked == "w" or self._etag_cache is None:
             self._etag_cache = super().etag
@@ -99,61 +81,11 @@ class Collection(


 class Storage(
-        StorageCreateCollectionMixin, StorageDiscoverMixin, StorageLockMixin,
-        StorageMoveMixin, StorageVerifyMixin, storage.BaseStorage):
+        StoragePartCreateCollection, StoragePartLock, StoragePartMove,
+        StoragePartVerify, StoragePartDiscover, StorageBase):

-    _collection_class = Collection
+    _collection_class: ClassVar[Type[Collection]] = Collection

-    def __init__(self, configuration):
+    def __init__(self, configuration: config.Configuration) -> None:
         super().__init__(configuration)
-        folder = configuration.get("storage", "filesystem_folder")
-        self._makedirs_synced(folder)
-
-    def _get_collection_root_folder(self):
-        filesystem_folder = self.configuration.get(
-            "storage", "filesystem_folder")
-        return os.path.join(filesystem_folder, "collection-root")
-
-    def _fsync(self, f):
-        if self.configuration.get("storage", "_filesystem_fsync"):
-            try:
-                pathutils.fsync(f.fileno())
-            except OSError as e:
-                raise RuntimeError("Fsync'ing file %r failed: %s" %
-                                   (f.name, e)) from e
-
-    def _sync_directory(self, path):
-        """Sync directory to disk.
-
-        This only works on POSIX and does nothing on other systems.
-
-        """
-        if not self.configuration.get("storage", "_filesystem_fsync"):
-            return
-        if os.name == "posix":
-            try:
-                fd = os.open(path, 0)
-                try:
-                    pathutils.fsync(fd)
-                finally:
-                    os.close(fd)
-            except OSError as e:
-                raise RuntimeError("Fsync'ing directory %r failed: %s" %
-                                   (path, e)) from e
-
-    def _makedirs_synced(self, filesystem_path):
-        """Recursively create a directory and its parents in a sync'ed way.
-
-        This method acts silently when the folder already exists.
-
-        """
-        if os.path.isdir(filesystem_path):
-            return
-        parent_filesystem_path = os.path.dirname(filesystem_path)
-        # Prevent infinite loop
-        if filesystem_path != parent_filesystem_path:
-            # Create parent dirs recursively
-            self._makedirs_synced(parent_filesystem_path)
-        # Possible race!
-        os.makedirs(filesystem_path, exist_ok=True)
-        self._sync_directory(parent_filesystem_path)
+        self._makedirs_synced(self._filesystem_folder)
123 radicale/storage/multifilesystem/base.py (new file)
@@ -0,0 +1,123 @@
+# This file is part of Radicale - CalDAV and CardDAV server
+# Copyright © 2014 Jean-Marc Martins
+# Copyright © 2012-2017 Guillaume Ayoub
+# Copyright © 2017-2019 Unrud <unrud@outlook.com>
+#
+# This library is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This library is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with Radicale.  If not, see <http://www.gnu.org/licenses/>.
+
+import os
+import sys
+from tempfile import TemporaryDirectory
+from typing import IO, AnyStr, ClassVar, Iterator, Optional, Type
+
+from radicale import config, pathutils, storage, types
+from radicale.storage import multifilesystem  # noqa:F401
+
+
+class CollectionBase(storage.BaseCollection):
+
+    _storage: "multifilesystem.Storage"
+    _path: str
+    _encoding: str
+    _filesystem_path: str
+
+    def __init__(self, storage_: "multifilesystem.Storage", path: str,
+                 filesystem_path: Optional[str] = None) -> None:
+        super().__init__()
+        self._storage = storage_
+        folder = storage_._get_collection_root_folder()
+        # Path should already be sanitized
+        self._path = pathutils.strip_path(path)
+        self._encoding = storage_.configuration.get("encoding", "stock")
+        if filesystem_path is None:
+            filesystem_path = pathutils.path_to_filesystem(folder, self.path)
+        self._filesystem_path = filesystem_path
+
+    @types.contextmanager
+    def _atomic_write(self, path: str, mode: str = "w",
+                      newline: Optional[str] = None) -> Iterator[IO[AnyStr]]:
+        # TODO: Overload with Literal when dropping support for Python < 3.8
+        parent_dir, name = os.path.split(path)
+        # Do not use mkstemp because it creates with permissions 0o600
+        with TemporaryDirectory(
+                prefix=".Radicale.tmp-", dir=parent_dir) as tmp_dir:
+            with open(os.path.join(tmp_dir, name), mode, newline=newline,
+                      encoding=None if "b" in mode else self._encoding) as tmp:
+                yield tmp
+                tmp.flush()
+                self._storage._fsync(tmp)
+            os.replace(os.path.join(tmp_dir, name), path)
+        self._storage._sync_directory(parent_dir)
+
+
+class StorageBase(storage.BaseStorage):
+
+    _collection_class: ClassVar[Type["multifilesystem.Collection"]]
+
+    _filesystem_folder: str
+    _filesystem_fsync: bool
+
+    def __init__(self, configuration: config.Configuration) -> None:
+        super().__init__(configuration)
+        self._filesystem_folder = configuration.get(
+            "storage", "filesystem_folder")
+        self._filesystem_fsync = configuration.get(
+            "storage", "_filesystem_fsync")
+
+    def _get_collection_root_folder(self) -> str:
+        return os.path.join(self._filesystem_folder, "collection-root")
+
+    def _fsync(self, f: IO[AnyStr]) -> None:
+        if self._filesystem_fsync:
+            try:
+                pathutils.fsync(f.fileno())
+            except OSError as e:
+                raise RuntimeError("Fsync'ing file %r failed: %s" %
+                                   (f.name, e)) from e
+
+    def _sync_directory(self, path: str) -> None:
+        """Sync directory to disk.
+
+        This only works on POSIX and does nothing on other systems.
+
+        """
+        if not self._filesystem_fsync:
+            return
+        if sys.platform != "win32":
+            try:
+                fd = os.open(path, 0)
+                try:
+                    pathutils.fsync(fd)
+                finally:
+                    os.close(fd)
+            except OSError as e:
+                raise RuntimeError("Fsync'ing directory %r failed: %s" %
+                                   (path, e)) from e
+
+    def _makedirs_synced(self, filesystem_path: str) -> None:
+        """Recursively create a directory and its parents in a sync'ed way.
+
+        This method acts silently when the folder already exists.
+
+        """
+        if os.path.isdir(filesystem_path):
+            return
+        parent_filesystem_path = os.path.dirname(filesystem_path)
+        # Prevent infinite loop
+        if filesystem_path != parent_filesystem_path:
+            # Create parent dirs recursively
+            self._makedirs_synced(parent_filesystem_path)
+        # Possible race!
+        os.makedirs(filesystem_path, exist_ok=True)
+        self._sync_directory(parent_filesystem_path)
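`_atomic_write` in the new `base.py` writes into a temporary directory next to the target and then `os.replace()`s the file into place, so concurrent readers see either the old file or the complete new one, never a partial write. Stripped of Radicale's configurable fsync plumbing, the pattern looks like this (the `atomic_write` name is this sketch's own):

```python
import contextlib
import os
from tempfile import TemporaryDirectory
from typing import IO, Iterator


@contextlib.contextmanager
def atomic_write(path: str, mode: str = "w") -> Iterator[IO]:
    parent_dir, name = os.path.split(path)
    # A temp dir next to the target keeps the rename on one filesystem,
    # which is what makes os.replace() atomic.
    with TemporaryDirectory(prefix=".tmp-", dir=parent_dir) as tmp_dir:
        with open(os.path.join(tmp_dir, name), mode) as tmp:
            yield tmp           # caller writes here
            tmp.flush()
            os.fsync(tmp.fileno())
        # Atomic on POSIX; replaces an existing target in one step.
        os.replace(os.path.join(tmp_dir, name), path)


with TemporaryDirectory() as folder:
    target = os.path.join(folder, "item.ics")
    with atomic_write(target) as f:
        f.write("BEGIN:VCALENDAR")
    with open(target) as f:
        written = f.read()
```

Radicale additionally fsyncs the parent directory afterwards (`_sync_directory`) so the rename itself survives a crash; that step is orthogonal to the replace-based atomicity shown here.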
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -16,20 +16,32 @@
 # You should have received a copy of the GNU General Public License
 # along with Radicale.  If not, see <http://www.gnu.org/licenses/>.

+import contextlib
 import os
 import pickle
 import time
 from hashlib import sha256
+from typing import BinaryIO, Iterable, NamedTuple, Optional, cast

+import radicale.item as radicale_item
 from radicale import pathutils, storage
 from radicale.log import logger
+from radicale.storage.multifilesystem.base import CollectionBase
+
+CacheContent = NamedTuple("CacheContent", [
+    ("uid", str), ("etag", str), ("text", str), ("name", str), ("tag", str),
+    ("start", int), ("end", int)])


-class CollectionCacheMixin:
-    def _clean_cache(self, folder, names, max_age=None):
+class CollectionPartCache(CollectionBase):
+
+    def _clean_cache(self, folder: str, names: Iterable[str],
+                     max_age: int = 0) -> None:
         """Delete all ``names`` in ``folder`` that are older than ``max_age``.
         """
-        age_limit = time.time() - max_age if max_age is not None else None
+        age_limit: Optional[float] = None
+        if max_age is not None and max_age > 0:
+            age_limit = time.time() - max_age
         modified = False
         for name in names:
             if not pathutils.is_safe_filesystem_path_component(name):
@ -54,51 +66,49 @@ class CollectionCacheMixin:
|
|||
self._storage._sync_directory(folder)
|
||||
|
||||
@staticmethod
|
||||
def _item_cache_hash(raw_text):
|
||||
     def _item_cache_hash(raw_text: bytes) -> str:
         _hash = sha256()
         _hash.update(storage.CACHE_VERSION)
         _hash.update(raw_text)
         return _hash.hexdigest()

-    def _item_cache_content(self, item, cache_hash=None):
-        text = item.serialize()
-        if cache_hash is None:
-            cache_hash = self._item_cache_hash(text.encode(self._encoding))
-        return (cache_hash, item.uid, item.etag, text, item.name,
-                item.component_name, *item.time_range)
+    def _item_cache_content(self, item: radicale_item.Item) -> CacheContent:
+        return CacheContent(item.uid, item.etag, item.serialize(), item.name,
+                            item.component_name, *item.time_range)

-    def _store_item_cache(self, href, item, cache_hash=None):
+    def _store_item_cache(self, href: str, item: radicale_item.Item,
+                          cache_hash: str = "") -> CacheContent:
         if not cache_hash:
             cache_hash = self._item_cache_hash(
                 item.serialize().encode(self._encoding))
         cache_folder = os.path.join(self._filesystem_path, ".Radicale.cache",
                                     "item")
-        content = self._item_cache_content(item, cache_hash)
+        content = self._item_cache_content(item)
         self._storage._makedirs_synced(cache_folder)
-        try:
-            # Race: Other processes might have created and locked the
-            # file.
-            with self._atomic_write(os.path.join(cache_folder, href),
-                                    "wb") as f:
-                pickle.dump(content, f)
-        except PermissionError:
-            pass
+        # Race: Other processes might have created and locked the file.
+        with contextlib.suppress(PermissionError), self._atomic_write(
+                os.path.join(cache_folder, href), "wb") as fo:
+            fb = cast(BinaryIO, fo)
+            pickle.dump((cache_hash, *content), fb)
         return content

-    def _load_item_cache(self, href, input_hash):
+    def _load_item_cache(self, href: str, cache_hash: str
+                         ) -> Optional[CacheContent]:
         cache_folder = os.path.join(self._filesystem_path, ".Radicale.cache",
                                     "item")
-        cache_hash = uid = etag = text = name = tag = start = end = None
         try:
             with open(os.path.join(cache_folder, href), "rb") as f:
-                cache_hash, *content = pickle.load(f)
-                if cache_hash == input_hash:
-                    uid, etag, text, name, tag, start, end = content
+                hash_, *remainder = pickle.load(f)
+                if hash_ and hash_ == cache_hash:
+                    return CacheContent(*remainder)
         except FileNotFoundError:
             pass
         except (pickle.UnpicklingError, ValueError) as e:
             logger.warning("Failed to load item cache entry %r in %r: %s",
                            href, self.path, e, exc_info=True)
-        return cache_hash, uid, etag, text, name, tag, start, end
+        return None

-    def _clean_item_cache(self):
+    def _clean_item_cache(self) -> None:
         cache_folder = os.path.join(self._filesystem_path, ".Radicale.cache",
                                     "item")
         self._clean_cache(cache_folder, (
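The new cache layout stores the validation hash as the first element of the pickled tuple and compares it on load. A minimal standalone sketch of that round trip — `CACHE_VERSION`, the `CacheContent` fields, and the helper names below are illustrative stand-ins, not Radicale's exact definitions:

```python
import os
import pickle
from hashlib import sha256
from typing import NamedTuple, Optional

CACHE_VERSION = b"v1"  # stand-in for storage.CACHE_VERSION


class CacheContent(NamedTuple):
    uid: str
    etag: str
    text: str


def item_cache_hash(raw_text: bytes) -> str:
    # Hash of the raw file content, prefixed with the cache format version
    _hash = sha256()
    _hash.update(CACHE_VERSION)
    _hash.update(raw_text)
    return _hash.hexdigest()


def store(path: str, cache_hash: str, content: CacheContent) -> None:
    with open(path, "wb") as f:
        # The hash is stored alongside the content so stale entries
        # can be detected on load.
        pickle.dump((cache_hash, *content), f)


def load(path: str, cache_hash: str) -> Optional[CacheContent]:
    try:
        with open(path, "rb") as f:
            hash_, *remainder = pickle.load(f)
            if hash_ and hash_ == cache_hash:
                return CacheContent(*remainder)
    except FileNotFoundError:
        pass
    return None  # missing or stale entry
```

A stale or missing entry simply yields `None`, which is what lets `_get` fall through to re-parsing the item.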
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -18,13 +18,19 @@

 import os
 from tempfile import TemporaryDirectory
+from typing import Iterable, Optional, cast

+import radicale.item as radicale_item
 from radicale import pathutils
+from radicale.storage import multifilesystem
+from radicale.storage.multifilesystem.base import StorageBase


-class StorageCreateCollectionMixin:
+class StoragePartCreateCollection(StorageBase):

-    def create_collection(self, href, items=None, props=None):
+    def create_collection(self, href: str,
+                          items: Optional[Iterable[radicale_item.Item]] = None,
+                          props=None) -> "multifilesystem.Collection":
         folder = self._get_collection_root_folder()

         # Path should already be sanitized
@@ -34,19 +40,21 @@ class StorageCreateCollectionMixin:
         if not props:
             self._makedirs_synced(filesystem_path)
             return self._collection_class(
-                self, pathutils.unstrip_path(sane_path, True))
+                cast(multifilesystem.Storage, self),
+                pathutils.unstrip_path(sane_path, True))

         parent_dir = os.path.dirname(filesystem_path)
         self._makedirs_synced(parent_dir)

         # Create a temporary directory with an unsafe name
-        with TemporaryDirectory(
-                prefix=".Radicale.tmp-", dir=parent_dir) as tmp_dir:
+        with TemporaryDirectory(prefix=".Radicale.tmp-", dir=parent_dir
+                                ) as tmp_dir:
             # The temporary directory itself can't be renamed
             tmp_filesystem_path = os.path.join(tmp_dir, "collection")
             os.makedirs(tmp_filesystem_path)
             col = self._collection_class(
-                self, pathutils.unstrip_path(sane_path, True),
+                cast(multifilesystem.Storage, self),
+                pathutils.unstrip_path(sane_path, True),
                 filesystem_path=tmp_filesystem_path)
             col.set_meta(props)
             if items is not None:
@@ -55,13 +63,12 @@ class StorageCreateCollectionMixin:
             elif props.get("tag") == "VADDRESSBOOK":
                 col._upload_all_nonatomic(items, suffix=".vcf")

-            # This operation is not atomic on the filesystem level but it's
-            # very unlikely that one rename operations succeeds while the
-            # other fails or that only one gets written to disk.
-            if os.path.exists(filesystem_path):
-                os.rename(filesystem_path, os.path.join(tmp_dir, "delete"))
-            os.rename(tmp_filesystem_path, filesystem_path)
+            if os.path.lexists(filesystem_path):
+                pathutils.rename_exchange(tmp_filesystem_path, filesystem_path)
+            else:
+                os.rename(tmp_filesystem_path, filesystem_path)
             self._sync_directory(parent_dir)

         return self._collection_class(
-            self, pathutils.unstrip_path(sane_path, True))
+            cast(multifilesystem.Storage, self),
+            pathutils.unstrip_path(sane_path, True))
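`create_collection` builds the new collection inside a temporary sibling directory and only then renames it into place. A simplified sketch of that pattern — the real code uses `pathutils.rename_exchange` when the target exists; the plain double rename below is an assumption-level stand-in, and `build` is a hypothetical callback:

```python
import os
from tempfile import TemporaryDirectory
from typing import Callable


def replace_dir_atomically(target: str,
                           build: Callable[[str], None]) -> None:
    """Build a directory next to `target`, then rename it into place."""
    parent = os.path.dirname(target)
    with TemporaryDirectory(prefix=".tmp-", dir=parent) as tmp_dir:
        # The temporary directory itself can't be renamed while the
        # context manager owns it, so build inside a subdirectory.
        staging = os.path.join(tmp_dir, "collection")
        os.makedirs(staging)
        build(staging)
        if not os.path.lexists(target):
            os.rename(staging, target)
        else:
            # Simplified fallback: park the old directory inside the
            # temp dir (deleted on exit), then move the new one in.
            os.rename(target, os.path.join(tmp_dir, "delete"))
            os.rename(staging, target)
```

Staging under the same parent keeps both renames on one filesystem, which is what makes them atomic on POSIX.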
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -18,20 +18,24 @@

 import os
 from tempfile import TemporaryDirectory
+from typing import Optional

 from radicale import pathutils, storage
+from radicale.storage.multifilesystem.base import CollectionBase
+from radicale.storage.multifilesystem.history import CollectionPartHistory


-class CollectionDeleteMixin:
-    def delete(self, href=None):
+class CollectionPartDelete(CollectionPartHistory, CollectionBase):
+
+    def delete(self, href: Optional[str] = None) -> None:
         if href is None:
             # Delete the collection
             parent_dir = os.path.dirname(self._filesystem_path)
             try:
                 os.rmdir(self._filesystem_path)
             except OSError:
-                with TemporaryDirectory(
-                        prefix=".Radicale.tmp-", dir=parent_dir) as tmp:
+                with TemporaryDirectory(prefix=".Radicale.tmp-", dir=parent_dir
+                                        ) as tmp:
                     os.rename(self._filesystem_path, os.path.join(
                         tmp, os.path.basename(self._filesystem_path)))
                     self._storage._sync_directory(parent_dir)
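The delete path above first tries `os.rmdir` and, for non-empty collections, moves the directory into a `TemporaryDirectory` so it is removed recursively when the context exits. A self-contained sketch (without the directory fsync the real code adds):

```python
import os
from tempfile import TemporaryDirectory


def delete_dir(path: str) -> None:
    """Remove a directory; non-empty ones are renamed into a temporary
    sibling that is deleted recursively on context exit."""
    parent = os.path.dirname(path)
    try:
        os.rmdir(path)  # fast path: empty directory
    except OSError:
        with TemporaryDirectory(prefix=".tmp-", dir=parent) as tmp:
            # The rename is atomic; the slow recursive delete happens
            # on a name no other process is using.
            os.rename(path, os.path.join(tmp, os.path.basename(path)))
```

The rename makes the collection disappear from its old name in one step, even though the actual cleanup takes longer.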
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -16,18 +16,31 @@
 # You should have received a copy of the GNU General Public License
 # along with Radicale. If not, see <http://www.gnu.org/licenses/>.

-import contextlib
 import os
 import posixpath
+from typing import Callable, ContextManager, Iterator, Optional, cast

-from radicale import pathutils
+from radicale import pathutils, types
 from radicale.log import logger
+from radicale.storage import multifilesystem
+from radicale.storage.multifilesystem.base import StorageBase


-class StorageDiscoverMixin:
+@types.contextmanager
+def _null_child_context_manager(path: str,
+                                href: Optional[str]) -> Iterator[None]:
+    yield

-    def discover(self, path, depth="0", child_context_manager=(
-            lambda path, href=None: contextlib.ExitStack())):

+class StoragePartDiscover(StorageBase):
+
+    def discover(
+            self, path: str, depth: str = "0", child_context_manager: Optional[
+                Callable[[str, Optional[str]], ContextManager[None]]] = None
+            ) -> Iterator[types.CollectionOrItem]:
+        # assert isinstance(self, multifilesystem.Storage)
+        if child_context_manager is None:
+            child_context_manager = _null_child_context_manager
         # Path should already be sanitized
         sane_path = pathutils.strip_path(path)
         attributes = sane_path.split("/") if sane_path else []
@@ -44,6 +57,7 @@ class StorageDiscoverMixin:
             return

         # Check if the path exists and if it leads to a collection or an item
+        href: Optional[str]
         if not os.path.isdir(filesystem_path):
             if attributes and os.path.isfile(filesystem_path):
                 href = attributes.pop()
@@ -54,10 +68,13 @@ class StorageDiscoverMixin:

         sane_path = "/".join(attributes)
         collection = self._collection_class(
-            self, pathutils.unstrip_path(sane_path, True))
+            cast(multifilesystem.Storage, self),
+            pathutils.unstrip_path(sane_path, True))

         if href:
-            yield collection._get(href)
+            item = collection._get(href)
+            if item is not None:
+                yield item
             return

         yield collection
@@ -67,7 +84,9 @@ class StorageDiscoverMixin:

         for href in collection._list():
             with child_context_manager(sane_path, href):
-                yield collection._get(href)
+                item = collection._get(href)
+                if item is not None:
+                    yield item

         for entry in os.scandir(filesystem_path):
             if not entry.is_dir():
@@ -80,5 +99,6 @@ class StorageDiscoverMixin:
                 continue
             sane_child_path = posixpath.join(sane_path, href)
             child_path = pathutils.unstrip_path(sane_child_path, True)
-            with child_context_manager(sane_child_path):
-                yield self._collection_class(self, child_path)
+            with child_context_manager(sane_child_path, None):
+                yield self._collection_class(
+                    cast(multifilesystem.Storage, self), child_path)
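Replacing the `lambda ... contextlib.ExitStack()` default with a module-level null context manager keeps the signature typable. A sketch using the stdlib `contextlib.contextmanager` (Radicale's `types.contextmanager` is assumed to be a typed wrapper around it); `discover` and its hrefs here are illustrative:

```python
import contextlib
from typing import Iterator, Optional


@contextlib.contextmanager
def _null_child_context_manager(path: str,
                                href: Optional[str]) -> Iterator[None]:
    # Does nothing; stands in when the caller passes no manager.
    yield


def discover(path, child_context_manager=None):
    # Resolve the default at call time instead of baking a lambda
    # into the signature.
    if child_context_manager is None:
        child_context_manager = _null_child_context_manager
    results = []
    for href in ("a.ics", "b.ics"):
        with child_context_manager(path, href):
            results.append(href)
    return results
```

Callers that want per-child setup/teardown (e.g. per-item locking) pass their own manager; everyone else pays nothing.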
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -17,21 +17,30 @@
 # along with Radicale. If not, see <http://www.gnu.org/licenses/>.

 import os
+import sys
 import time
+from typing import Iterable, Iterator, Optional, Tuple

 import vobject

-from radicale import item as radicale_item
+import radicale.item as radicale_item
 from radicale import pathutils
 from radicale.log import logger
+from radicale.storage import multifilesystem
+from radicale.storage.multifilesystem.base import CollectionBase
+from radicale.storage.multifilesystem.cache import CollectionPartCache
+from radicale.storage.multifilesystem.lock import CollectionPartLock


-class CollectionGetMixin:
-    def __init__(self):
-        super().__init__()
+class CollectionPartGet(CollectionPartCache, CollectionPartLock,
+                        CollectionBase):
+
+    _item_cache_cleaned: bool
+
+    def __init__(self, storage_: "multifilesystem.Storage", path: str,
+                 filesystem_path: Optional[str] = None) -> None:
+        super().__init__(storage_, path, filesystem_path)
         self._item_cache_cleaned = False

-    def _list(self):
+    def _list(self) -> Iterator[str]:
         for entry in os.scandir(self._filesystem_path):
             if not entry.is_file():
                 continue
@@ -42,13 +51,14 @@ class CollectionGetMixin:
                 continue
             yield href

-    def _get(self, href, verify_href=True):
+    def _get(self, href: str, verify_href: bool = True
+             ) -> Optional[radicale_item.Item]:
         if verify_href:
             try:
                 if not pathutils.is_safe_filesystem_path_component(href):
                     raise pathutils.UnsafePathError(href)
-                path = pathutils.path_to_filesystem(
-                    self._filesystem_path, href)
+                path = pathutils.path_to_filesystem(self._filesystem_path,
+                                                    href)
             except ValueError as e:
                 logger.debug(
                     "Can't translate name %r safely to filesystem in %r: %s",
@@ -63,36 +73,33 @@ class CollectionGetMixin:
             return None
         except PermissionError:
             # Windows raises ``PermissionError`` when ``path`` is a directory
-            if (os.name == "nt" and
+            if (sys.platform == "win32" and
                     os.path.isdir(path) and os.access(path, os.R_OK)):
                 return None
             raise
         # The hash of the component in the file system. This is used to check,
         # if the entry in the cache is still valid.
-        input_hash = self._item_cache_hash(raw_text)
-        cache_hash, uid, etag, text, name, tag, start, end = \
-            self._load_item_cache(href, input_hash)
-        if input_hash != cache_hash:
+        cache_hash = self._item_cache_hash(raw_text)
+        cache_content = self._load_item_cache(href, cache_hash)
+        if cache_content is None:
             with self._acquire_cache_lock("item"):
                 # Lock the item cache to prevent multpile processes from
                 # generating the same data in parallel.
                 # This improves the performance for multiple requests.
                 if self._storage._lock.locked == "r":
                     # Check if another process created the file in the meantime
-                    cache_hash, uid, etag, text, name, tag, start, end = \
-                        self._load_item_cache(href, input_hash)
-                if input_hash != cache_hash:
+                    cache_content = self._load_item_cache(href, cache_hash)
+                if cache_content is None:
                     try:
-                        vobject_items = tuple(vobject.readComponents(
-                            raw_text.decode(self._encoding)))
+                        vobject_items = radicale_item.read_components(
+                            raw_text.decode(self._encoding))
                         radicale_item.check_and_sanitize_items(
-                            vobject_items, tag=self.get_meta("tag"))
+                            vobject_items, tag=self.tag)
                         vobject_item, = vobject_items
                         temp_item = radicale_item.Item(
                             collection=self, vobject_item=vobject_item)
-                        cache_hash, uid, etag, text, name, tag, start, end = \
-                            self._store_item_cache(
-                                href, temp_item, input_hash)
+                        cache_content = self._store_item_cache(
+                            href, temp_item, cache_hash)
                     except Exception as e:
                         raise RuntimeError("Failed to load item %r in %r: %s" %
                                            (href, self.path, e)) from e
@@ -107,11 +114,14 @@ class CollectionGetMixin:
         # Don't keep reference to ``vobject_item``, because it requires a lot
         # of memory.
         return radicale_item.Item(
-            collection=self, href=href, last_modified=last_modified, etag=etag,
-            text=text, uid=uid, name=name, component_name=tag,
-            time_range=(start, end))
+            collection=self, href=href, last_modified=last_modified,
+            etag=cache_content.etag, text=cache_content.text,
+            uid=cache_content.uid, name=cache_content.name,
+            component_name=cache_content.tag,
+            time_range=(cache_content.start, cache_content.end))

-    def get_multi(self, hrefs):
+    def get_multi(self, hrefs: Iterable[str]
+                  ) -> Iterator[Tuple[str, Optional[radicale_item.Item]]]:
         # It's faster to check for file name collissions here, because
         # we only need to call os.listdir once.
         files = None
@@ -123,13 +133,16 @@ class CollectionGetMixin:
             path = os.path.join(self._filesystem_path, href)
             if (not pathutils.is_safe_filesystem_path_component(href) or
                     href not in files and os.path.lexists(path)):
-                logger.debug(
-                    "Can't translate name safely to filesystem: %r", href)
+                logger.debug("Can't translate name safely to filesystem: %r",
+                             href)
                 yield (href, None)
             else:
                 yield (href, self._get(href, verify_href=False))

-    def get_all(self):
-        # We don't need to check for collissions, because the the file names
-        # are from os.listdir.
-        return (self._get(href, verify_href=False) for href in self._list())
+    def get_all(self) -> Iterator[radicale_item.Item]:
+        for href in self._list():
+            # We don't need to check for collissions, because the file names
+            # are from os.listdir.
+            item = self._get(href, verify_href=False)
+            if item is not None:
+                yield item
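`_get` now does a double-checked cache load: an unlocked look-up, then a re-check under the cache lock before recomputing. The same idea in miniature, with an in-memory dict and `threading.Lock` standing in for the on-disk cache and the inter-process cache lock (both stand-ins are assumptions):

```python
import threading


class DoubleCheckedCache:
    """Load -> lock -> re-load -> compute, as in _get above."""

    def __init__(self, compute):
        self._cache = {}
        self._lock = threading.Lock()
        self._compute = compute
        self.computations = 0  # instrumentation for the sketch

    def get(self, key):
        value = self._cache.get(key)  # first, unlocked look-up
        if value is None:
            with self._lock:
                # Another worker might have filled the entry while we
                # waited for the lock; check again before computing.
                value = self._cache.get(key)
                if value is None:
                    self.computations += 1
                    value = self._compute(key)
                    self._cache[key] = value
        return value
```

The second look-up under the lock is what prevents several concurrent readers from parsing the same item in parallel.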
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2019 Unrud <unrud@outlook.com>
@@ -17,15 +17,28 @@
 # along with Radicale. If not, see <http://www.gnu.org/licenses/>.

 import binascii
+import contextlib
 import os
 import pickle
+from typing import BinaryIO, Optional, cast

-from radicale import item as radicale_item
+import radicale.item as radicale_item
 from radicale import pathutils
 from radicale.log import logger
+from radicale.storage import multifilesystem
+from radicale.storage.multifilesystem.base import CollectionBase


-class CollectionHistoryMixin:
+class CollectionPartHistory(CollectionBase):
+
+    _max_sync_token_age: int
+
+    def __init__(self, storage_: "multifilesystem.Storage", path: str,
+                 filesystem_path: Optional[str] = None) -> None:
+        super().__init__(storage_, path, filesystem_path)
+        self._max_sync_token_age = storage_.configuration.get(
+            "storage", "max_sync_token_age")
+
     def _update_history_etag(self, href, item):
         """Updates and retrieves the history etag from the history cache.

@@ -53,13 +66,11 @@ class CollectionHistoryMixin:
         self._storage._makedirs_synced(history_folder)
         history_etag = radicale_item.get_etag(
             history_etag + "/" + etag).strip("\"")
-        try:
-            # Race: Other processes might have created and locked the file.
-            with self._atomic_write(os.path.join(history_folder, href),
-                                    "wb") as f:
-                pickle.dump([etag, history_etag], f)
-        except PermissionError:
-            pass
+        # Race: Other processes might have created and locked the file.
+        with contextlib.suppress(PermissionError), self._atomic_write(
+                os.path.join(history_folder, href), "wb") as fo:
+            fb = cast(BinaryIO, fo)
+            pickle.dump([etag, history_etag], fb)
         return history_etag

     def _get_deleted_history_hrefs(self):
@@ -67,7 +78,7 @@ class CollectionHistoryMixin:
         history cache."""
         history_folder = os.path.join(self._filesystem_path,
                                       ".Radicale.cache", "history")
-        try:
+        with contextlib.suppress(FileNotFoundError):
             for entry in os.scandir(history_folder):
                 href = entry.name
                 if not pathutils.is_safe_filesystem_path_component(href):
@@ -75,13 +86,10 @@ class CollectionHistoryMixin:
                 if os.path.isfile(os.path.join(self._filesystem_path, href)):
                     continue
                 yield href
-        except FileNotFoundError:
-            pass

     def _clean_history(self):
         # Delete all expired history entries of deleted items.
         history_folder = os.path.join(self._filesystem_path,
                                       ".Radicale.cache", "history")
         self._clean_cache(history_folder, self._get_deleted_history_hrefs(),
-                          max_age=self._storage.configuration.get(
-                              "storage", "max_sync_token_age"))
+                          max_age=self._max_sync_token_age)
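`_update_history_etag` chains the stored history etag with the item's current etag, so every change yields a fresh 64-character token. A sketch, under the assumption that `radicale.item.get_etag` is a quoted SHA-256 hex digest (the helper below is a local stand-in, not the real function):

```python
from hashlib import sha256


def get_etag(text: str) -> str:
    # Assumed shape of radicale.item.get_etag: quoted sha256 hex digest
    return '"%s"' % sha256(text.encode()).hexdigest()


def update_history_etag(history_etag: str, etag: str) -> str:
    """Chain the previous history etag with the item's current etag,
    as _update_history_etag does above."""
    return get_etag(history_etag + "/" + etag).strip('"')
```

Because the previous history etag is folded into the hash, re-uploading an item with the same content still produces a new history etag, which is what lets sync clients see the touch.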
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2019 Unrud <unrud@outlook.com>
@@ -20,52 +20,82 @@ import contextlib
 import logging
 import os
 import shlex
+import signal
 import subprocess
+import sys
+from typing import Iterator

-from radicale import pathutils
+from radicale import config, pathutils, types
 from radicale.log import logger
+from radicale.storage.multifilesystem.base import CollectionBase, StorageBase


-class CollectionLockMixin:
-    def _acquire_cache_lock(self, ns=""):
+class CollectionPartLock(CollectionBase):
+
+    @types.contextmanager
+    def _acquire_cache_lock(self, ns: str = "") -> Iterator[None]:
         if self._storage._lock.locked == "w":
-            return contextlib.ExitStack()
+            yield
+            return
         cache_folder = os.path.join(self._filesystem_path, ".Radicale.cache")
         self._storage._makedirs_synced(cache_folder)
         lock_path = os.path.join(cache_folder,
                                  ".Radicale.lock" + (".%s" % ns if ns else ""))
         lock = pathutils.RwLock(lock_path)
-        return lock.acquire("w")
+        with lock.acquire("w"):
+            yield


-class StorageLockMixin:
+class StoragePartLock(StorageBase):

-    def __init__(self, configuration):
+    _lock: pathutils.RwLock
+    _hook: str
+
+    def __init__(self, configuration: config.Configuration) -> None:
         super().__init__(configuration)
-        folder = self.configuration.get("storage", "filesystem_folder")
-        lock_path = os.path.join(folder, ".Radicale.lock")
+        lock_path = os.path.join(self._filesystem_folder, ".Radicale.lock")
         self._lock = pathutils.RwLock(lock_path)
+        self._hook = configuration.get("storage", "hook")

-    @contextlib.contextmanager
-    def acquire_lock(self, mode, user=None):
+    @types.contextmanager
+    def acquire_lock(self, mode: str, user: str = "") -> Iterator[None]:
         with self._lock.acquire(mode):
             yield
             # execute hook
-            hook = self.configuration.get("storage", "hook")
-            if mode == "w" and hook:
-                folder = self.configuration.get("storage", "filesystem_folder")
-                logger.debug("Running hook")
+            if mode == "w" and self._hook:
                 debug = logger.isEnabledFor(logging.DEBUG)
+                # Use new process group for child to prevent terminals
+                # from sending SIGINT etc.
+                preexec_fn = None
+                creationflags = 0
+                if sys.platform == "win32":
+                    creationflags |= subprocess.CREATE_NEW_PROCESS_GROUP
+                else:
+                    # Process group is also used to identify child processes
+                    preexec_fn = os.setpgrp
+                command = self._hook % {
+                    "user": shlex.quote(user or "Anonymous")}
+                logger.debug("Running storage hook")
                 p = subprocess.Popen(
-                    hook % {"user": shlex.quote(user or "Anonymous")},
-                    stdin=subprocess.DEVNULL,
+                    command, stdin=subprocess.DEVNULL,
                     stdout=subprocess.PIPE if debug else subprocess.DEVNULL,
                     stderr=subprocess.PIPE if debug else subprocess.DEVNULL,
-                    shell=True, universal_newlines=True, cwd=folder)
-                stdout_data, stderr_data = p.communicate()
+                    shell=True, universal_newlines=True, preexec_fn=preexec_fn,
+                    cwd=self._filesystem_folder, creationflags=creationflags)
+                try:
+                    stdout_data, stderr_data = p.communicate()
+                except BaseException:  # e.g. KeyboardInterrupt or SystemExit
+                    p.kill()
+                    p.wait()
+                    raise
+                finally:
+                    if sys.platform != "win32":
+                        # Kill remaining children identified by process group
+                        with contextlib.suppress(OSError):
+                            os.killpg(p.pid, signal.SIGKILL)
                 if stdout_data:
-                    logger.debug("Captured stdout hook:\n%s", stdout_data)
+                    logger.debug("Captured stdout from hook:\n%s", stdout_data)
                 if stderr_data:
-                    logger.debug("Captured stderr hook:\n%s", stderr_data)
+                    logger.debug("Captured stderr from hook:\n%s", stderr_data)
                 if p.returncode != 0:
                     raise subprocess.CalledProcessError(p.returncode, p.args)
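The reworked hook runner shell-quotes the user, detaches the child into its own process group (POSIX `preexec_fn=os.setpgrp`, Windows `CREATE_NEW_PROCESS_GROUP`), and captures output. A trimmed sketch without the kill/cleanup paths; `run_hook` is an illustrative name, not Radicale's API:

```python
import os
import shlex
import subprocess
import sys


def run_hook(command_template: str, user: str, cwd: str = ".") -> str:
    """Fill %(user)s with a shell-quoted user name and run the hook in
    its own process group so terminal signals don't reach it."""
    preexec_fn = None
    creationflags = 0
    if sys.platform == "win32":
        creationflags |= subprocess.CREATE_NEW_PROCESS_GROUP
    else:
        preexec_fn = os.setpgrp  # new process group on POSIX
    command = command_template % {"user": shlex.quote(user or "Anonymous")}
    p = subprocess.Popen(
        command, stdin=subprocess.DEVNULL, stdout=subprocess.PIPE,
        stderr=subprocess.DEVNULL, shell=True, universal_newlines=True,
        preexec_fn=preexec_fn, cwd=cwd, creationflags=creationflags)
    stdout_data, _ = p.communicate()
    if p.returncode != 0:
        raise subprocess.CalledProcessError(p.returncode, p.args)
    return stdout_data
```

Quoting via `shlex.quote` matters because the command runs through the shell; an unquoted user name could otherwise inject shell syntax.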
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -18,32 +18,49 @@

 import json
 import os
+from typing import Mapping, Optional, TextIO, Union, cast, overload

-from radicale import item as radicale_item
+import radicale.item as radicale_item
+from radicale.storage import multifilesystem
+from radicale.storage.multifilesystem.base import CollectionBase


-class CollectionMetaMixin:
-    def __init__(self):
-        super().__init__()
+class CollectionPartMeta(CollectionBase):
+
+    _meta_cache: Optional[Mapping[str, str]]
+    _props_path: str
+
+    def __init__(self, storage_: "multifilesystem.Storage", path: str,
+                 filesystem_path: Optional[str] = None) -> None:
+        super().__init__(storage_, path, filesystem_path)
         self._meta_cache = None
         self._props_path = os.path.join(
             self._filesystem_path, ".Radicale.props")

-    def get_meta(self, key=None):
+    @overload
+    def get_meta(self, key: None = None) -> Mapping[str, str]: ...
+
+    @overload
+    def get_meta(self, key: str) -> Optional[str]: ...
+
+    def get_meta(self, key: Optional[str] = None) -> Union[Mapping[str, str],
+                                                           Optional[str]]:
         # reuse cached value if the storage is read-only
         if self._storage._lock.locked == "w" or self._meta_cache is None:
             try:
                 try:
                     with open(self._props_path, encoding=self._encoding) as f:
-                        self._meta_cache = json.load(f)
+                        temp_meta = json.load(f)
                 except FileNotFoundError:
-                    self._meta_cache = {}
-                radicale_item.check_and_sanitize_props(self._meta_cache)
+                    temp_meta = {}
+                self._meta_cache = radicale_item.check_and_sanitize_props(
+                    temp_meta)
             except ValueError as e:
                 raise RuntimeError("Failed to load properties of collection "
                                    "%r: %s" % (self.path, e)) from e
-        return self._meta_cache.get(key) if key else self._meta_cache
+        return self._meta_cache if key is None else self._meta_cache.get(key)

-    def set_meta(self, props):
-        with self._atomic_write(self._props_path, "w") as f:
+    def set_meta(self, props: Mapping[str, str]) -> None:
+        with self._atomic_write(self._props_path, "w") as fo:
+            f = cast(TextIO, fo)
             json.dump(props, f, sort_keys=True)
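The `@overload` pair gives `get_meta` a precise return type per call shape, and switching from truthiness to `key is None` distinguishes "no key" from an empty-string key. A self-contained sketch (the `Meta` class is illustrative, not Radicale's):

```python
from typing import Mapping, Optional, Union, overload


class Meta:
    """Overloaded get_meta: no key -> whole mapping, key -> Optional[str]."""

    def __init__(self, props: Mapping[str, str]) -> None:
        self._meta = dict(props)

    @overload
    def get_meta(self, key: None = None) -> Mapping[str, str]: ...

    @overload
    def get_meta(self, key: str) -> Optional[str]: ...

    def get_meta(self, key: Optional[str] = None
                 ) -> Union[Mapping[str, str], Optional[str]]:
        # `key is None` (not falsy `key`) selects the whole mapping, so
        # an empty-string key correctly returns None instead of the dict.
        return self._meta if key is None else self._meta.get(key)
```

The old `if key` test would have returned the entire mapping for `get_meta("")`; the `is None` form avoids that edge case.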
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -18,19 +18,25 @@

 import os

-from radicale import pathutils
+from radicale import item as radicale_item
+from radicale import pathutils, storage
+from radicale.storage import multifilesystem
+from radicale.storage.multifilesystem.base import StorageBase


-class StorageMoveMixin:
+class StoragePartMove(StorageBase):

-    def move(self, item, to_collection, to_href):
+    def move(self, item: radicale_item.Item,
+             to_collection: storage.BaseCollection, to_href: str) -> None:
         if not pathutils.is_safe_filesystem_path_component(to_href):
             raise pathutils.UnsafePathError(to_href)
-        os.replace(
-            pathutils.path_to_filesystem(
-                item.collection._filesystem_path, item.href),
-            pathutils.path_to_filesystem(
-                to_collection._filesystem_path, to_href))
+        assert isinstance(to_collection, multifilesystem.Collection)
+        assert isinstance(item.collection, multifilesystem.Collection)
+        assert item.href
+        os.replace(pathutils.path_to_filesystem(
+            item.collection._filesystem_path, item.href),
+            pathutils.path_to_filesystem(
+                to_collection._filesystem_path, to_href))
         self._sync_directory(to_collection._filesystem_path)
         if item.collection._filesystem_path != to_collection._filesystem_path:
             self._sync_directory(item.collection._filesystem_path)
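`move` relies on `os.replace` plus fsyncing the affected directories so the rename itself is durable. A simplified POSIX-only sketch; the real `_sync_directory` also handles platforms where a directory cannot be fsynced, so treat this as an assumption-level stand-in:

```python
import os


def sync_directory(path: str) -> None:
    # POSIX-only: fsync the directory so the rename entry is on disk
    fd = os.open(path, os.O_RDONLY)
    try:
        os.fsync(fd)
    finally:
        os.close(fd)


def move_item(src: str, dst: str) -> None:
    os.replace(src, dst)  # atomic when src and dst share a filesystem
    sync_directory(os.path.dirname(dst))
    # The source directory only needs syncing when it differs
    if os.path.dirname(src) != os.path.dirname(dst):
        sync_directory(os.path.dirname(src))
```

Without the directory fsync, a crash right after the rename could leave the entry in neither (or both) directories after recovery on some filesystems.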
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2019 Unrud <unrud@outlook.com>
@@ -16,20 +16,27 @@
 # You should have received a copy of the GNU General Public License
 # along with Radicale. If not, see <http://www.gnu.org/licenses/>.

+import contextlib
 import itertools
 import os
 import pickle
 from hashlib import sha256
+from typing import BinaryIO, Iterable, Tuple, cast

 from radicale.log import logger
+from radicale.storage.multifilesystem.base import CollectionBase
+from radicale.storage.multifilesystem.cache import CollectionPartCache
+from radicale.storage.multifilesystem.history import CollectionPartHistory


-class CollectionSyncMixin:
-    def sync(self, old_token=None):
+class CollectionPartSync(CollectionPartCache, CollectionPartHistory,
+                         CollectionBase):
+
+    def sync(self, old_token: str = "") -> Tuple[str, Iterable[str]]:
         # The sync token has the form http://radicale.org/ns/sync/TOKEN_NAME
         # where TOKEN_NAME is the sha256 hash of all history etags of present
         # and past items of the collection.
-        def check_token_name(token_name):
+        def check_token_name(token_name: str) -> bool:
             if len(token_name) != 64:
                 return False
             for c in token_name:
@@ -37,7 +44,7 @@ class CollectionSyncMixin:
                     return False
             return True

-        old_token_name = None
+        old_token_name = ""
         if old_token:
             # Extract the token name from the sync token
             if not old_token.startswith("http://radicale.org/ns/sync/"):
@@ -78,10 +85,9 @@ class CollectionSyncMixin:
                     "Failed to load stored sync token %r in %r: %s",
                     old_token_name, self.path, e, exc_info=True)
                 # Delete the damaged file
-                try:
+                with contextlib.suppress(FileNotFoundError,
+                                         PermissionError):
                     os.remove(old_token_path)
-                except (FileNotFoundError, PermissionError):
-                    pass
                 raise ValueError("Token not found: %r" % old_token)
         # write the new token state or update the modification time of
         # existing token state
@@ -89,23 +95,21 @@ class CollectionSyncMixin:
             self._storage._makedirs_synced(token_folder)
             try:
                 # Race: Other processes might have created and locked the file.
-                with self._atomic_write(token_path, "wb") as f:
-                    pickle.dump(state, f)
+                with self._atomic_write(token_path, "wb") as fo:
+                    fb = cast(BinaryIO, fo)
+                    pickle.dump(state, fb)
             except PermissionError:
                 pass
             else:
                 # clean up old sync tokens and item cache
                 self._clean_cache(token_folder, os.listdir(token_folder),
-                                  max_age=self._storage.configuration.get(
-                                      "storage", "max_sync_token_age"))
+                                  max_age=self._max_sync_token_age)
                 self._clean_history()
         else:
             # Try to update the modification time
-            try:
+            with contextlib.suppress(FileNotFoundError):
                 # Race: Another process might have deleted the file.
                 os.utime(token_path)
-            except FileNotFoundError:
-                pass
         changes = []
         # Find all new, changed and deleted (that are still in the item cache)
         # items
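Sync token names are validated before being used as cache file names: a valid name is the 64-character lowercase hex SHA-256 digest. A sketch of `check_token_name` (the loop body elided by the diff is assumed to check for lowercase hex digits):

```python
import string


def check_token_name(token_name: str) -> bool:
    """True only for a 64-character lowercase hex string."""
    if len(token_name) != 64:
        return False
    for c in token_name:
        # sha256 hexdigests are lowercase, so uppercase is rejected too
        if c not in string.hexdigits.lower():
            return False
    return True
```

Rejecting anything else keeps attacker-supplied sync tokens from turning into arbitrary file names under `.Radicale.cache`.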
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -16,15 +16,25 @@
 # You should have received a copy of the GNU General Public License
 # along with Radicale.  If not, see <http://www.gnu.org/licenses/>.

+import errno
 import os
 import pickle
+import sys
+from typing import Iterable, Iterator, TextIO, cast

-from radicale import item as radicale_item
+import radicale.item as radicale_item
 from radicale import pathutils
+from radicale.storage.multifilesystem.base import CollectionBase
+from radicale.storage.multifilesystem.cache import CollectionPartCache
+from radicale.storage.multifilesystem.get import CollectionPartGet
+from radicale.storage.multifilesystem.history import CollectionPartHistory


-class CollectionUploadMixin:
-    def upload(self, href, item):
+class CollectionPartUpload(CollectionPartGet, CollectionPartCache,
+                           CollectionPartHistory, CollectionBase):
+
+    def upload(self, href: str, item: radicale_item.Item
+               ) -> radicale_item.Item:
         if not pathutils.is_safe_filesystem_path_component(href):
             raise pathutils.UnsafePathError(href)
         try:
@@ -33,27 +43,40 @@ class CollectionUploadMixin:
             raise ValueError("Failed to store item %r in collection %r: %s" %
                              (href, self.path, e)) from e
         path = pathutils.path_to_filesystem(self._filesystem_path, href)
-        with self._atomic_write(path, newline="") as fd:
-            fd.write(item.serialize())
+        with self._atomic_write(path, newline="") as fo:
+            f = cast(TextIO, fo)
+            f.write(item.serialize())
         # Clean the cache after the actual item is stored, or the cache entry
         # will be removed again.
         self._clean_item_cache()
         # Track the change
         self._update_history_etag(href, item)
         self._clean_history()
-        return self._get(href, verify_href=False)
+        uploaded_item = self._get(href, verify_href=False)
+        if uploaded_item is None:
+            raise RuntimeError("Storage modified externally")
+        return uploaded_item

-    def _upload_all_nonatomic(self, items, suffix=""):
-        """Upload a new set of items.
-
-        This takes a list of vobject items and
-        uploads them nonatomic and without existence checks.
-
-        """
+    def _upload_all_nonatomic(self, items: Iterable[radicale_item.Item],
+                              suffix: str = "") -> None:
+        """Upload a new set of items non-atomic"""
+        def is_safe_free_href(href: str) -> bool:
+            return (pathutils.is_safe_filesystem_path_component(href) and
+                    not os.path.lexists(
+                        os.path.join(self._filesystem_path, href)))
+
+        def get_safe_free_hrefs(uid: str) -> Iterator[str]:
+            for href in [uid if uid.lower().endswith(suffix.lower())
+                         else uid + suffix,
+                         radicale_item.get_etag(uid).strip('"') + suffix]:
+                if is_safe_free_href(href):
+                    yield href
+            yield radicale_item.find_available_uid(
+                lambda href: not is_safe_free_href(href), suffix)
+
         cache_folder = os.path.join(self._filesystem_path,
                                     ".Radicale.cache", "item")
         self._storage._makedirs_synced(cache_folder)
         hrefs = set()
         for item in items:
             uid = item.uid
             try:
@@ -62,43 +85,27 @@ class CollectionUploadMixin:
                 raise ValueError(
                     "Failed to store item %r in temporary collection %r: %s" %
                     (uid, self.path, e)) from e
-            href_candidate_funtions = []
-            if os.name in ("nt", "posix"):
-                href_candidate_funtions.append(
-                    lambda: uid if uid.lower().endswith(suffix.lower())
-                    else uid + suffix)
-            href_candidate_funtions.extend((
-                lambda: radicale_item.get_etag(uid).strip('"') + suffix,
-                lambda: radicale_item.find_available_uid(hrefs.__contains__,
-                                                         suffix)))
-            href = f = None
-            while href_candidate_funtions:
-                href = href_candidate_funtions.pop(0)()
-                if href in hrefs:
-                    continue
-                if not pathutils.is_safe_filesystem_path_component(href):
-                    if not href_candidate_funtions:
-                        raise pathutils.UnsafePathError(href)
-                    continue
+            for href in get_safe_free_hrefs(uid):
                 try:
-                    f = open(pathutils.path_to_filesystem(
-                        self._filesystem_path, href),
-                        "w", newline="", encoding=self._encoding)
-                    break
+                    f = open(os.path.join(self._filesystem_path, href),
+                             "w", newline="", encoding=self._encoding)
                 except OSError as e:
-                    if href_candidate_funtions and (
-                            os.name == "posix" and e.errno == 22 or
-                            os.name == "nt" and e.errno == 123):
+                    if (sys.platform != "win32" and e.errno == errno.EINVAL or
+                            sys.platform == "win32" and e.errno == 123):
+                        # not a valid filename
                         continue
                     raise
+                break
+            else:
+                raise RuntimeError("No href found for item %r in temporary "
+                                   "collection %r" % (uid, self.path))
             with f:
                 f.write(item.serialize())
                 f.flush()
                 self._storage._fsync(f)
             hrefs.add(href)
-            with open(os.path.join(cache_folder, href), "wb") as f:
-                pickle.dump(cache_content, f)
-                f.flush()
-                self._storage._fsync(f)
+            with open(os.path.join(cache_folder, href), "wb") as fb:
+                pickle.dump(cache_content, fb)
+                fb.flush()
+                self._storage._fsync(fb)
         self._storage._sync_directory(cache_folder)
         self._storage._sync_directory(self._filesystem_path)
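The rewritten loop above relies on Python's for/else: the else branch runs only when the loop finishes without hitting break, i.e. when every candidate href was rejected. A standalone sketch of the pattern (function and candidate names are illustrative, not Radicale's API):

```python
def first_valid(candidates, is_valid):
    """Return the first candidate accepted by is_valid.

    Mirrors the for/else structure: break skips the else clause,
    exhausting the iterable without a break triggers it.
    """
    for candidate in candidates:
        if is_valid(candidate):
            chosen = candidate
            break  # a valid candidate was found, skip the else below
    else:
        # Runs only when the loop ended without a break
        raise RuntimeError("no valid candidate found")
    return chosen


# '@' stands in for characters rejected by a filename check.
print(first_valid(["b@d", "item.ics"], lambda c: "@" not in c))  # item.ics
```

The same shape lets the real code distinguish "a usable href was opened" from "all candidates failed" without a sentinel variable.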
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -16,23 +16,27 @@
 # You should have received a copy of the GNU General Public License
 # along with Radicale.  If not, see <http://www.gnu.org/licenses/>.

-import contextlib
+from typing import Iterator, Optional, Set

-from radicale import pathutils, storage
+from radicale import pathutils, storage, types
 from radicale.log import logger
+from radicale.storage.multifilesystem.base import StorageBase
+from radicale.storage.multifilesystem.discover import StoragePartDiscover


-class StorageVerifyMixin:
-    def verify(self):
+class StoragePartVerify(StoragePartDiscover, StorageBase):
+
+    def verify(self) -> bool:
         item_errors = collection_errors = 0

-        @contextlib.contextmanager
-        def exception_cm(sane_path, href=None):
+        @types.contextmanager
+        def exception_cm(sane_path: str, href: Optional[str]
+                         ) -> Iterator[None]:
             nonlocal item_errors, collection_errors
             try:
                 yield
             except Exception as e:
-                if href:
+                if href is not None:
                     item_errors += 1
                     name = "item %r in %r" % (href, sane_path)
                 else:
@@ -45,13 +49,14 @@ class StorageVerifyMixin:
             sane_path = remaining_sane_paths.pop(0)
             path = pathutils.unstrip_path(sane_path, True)
             logger.debug("Verifying collection %r", sane_path)
-            with exception_cm(sane_path):
+            with exception_cm(sane_path, None):
                 saved_item_errors = item_errors
-                collection = None
-                uids = set()
+                collection: Optional[storage.BaseCollection] = None
+                uids: Set[str] = set()
                 has_child_collections = False
                 for item in self.discover(path, "1", exception_cm):
                     if not collection:
+                        assert isinstance(item, storage.BaseCollection)
                         collection = item
                         collection.get_meta()
                         continue
@@ -65,10 +70,11 @@ class StorageVerifyMixin:
                     uids.add(item.uid)
                     logger.debug("Verified item %r in %r",
                                  item.href, sane_path)
+                assert collection
                 if item_errors == saved_item_errors:
                     collection.sync()
-                if has_child_collections and collection.get_meta("tag"):
+                if has_child_collections and collection.tag:
                     logger.error("Invalid collection %r: %r must not have "
                                  "child collections", sane_path,
-                                 collection.get_meta("tag"))
+                                 collection.tag)
         return item_errors == 0 and collection_errors == 0
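verify() counts failures through a generator-based context manager that mutates counters in the enclosing scope via nonlocal, so a single broken item does not abort the whole verification run. A reduced sketch of that error-counting technique (names here are illustrative):

```python
import contextlib


def verify_all(checks):
    """Run every check, counting failures instead of propagating them,
    in the style of verify()'s exception_cm helper."""
    errors = 0

    @contextlib.contextmanager
    def exception_cm():
        nonlocal errors  # mutate the counter in the enclosing function
        try:
            yield
        except Exception:
            errors += 1  # swallow the failure, keep going

    for check in checks:
        with exception_cm():
            check()
    return errors


def ok():
    return None


def fails():
    return 1 / 0  # ZeroDivisionError, caught by the context manager


print(verify_all([ok, fails, ok, fails]))  # 2
```

Every check runs even when earlier ones fail; the caller only sees the final error count.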
114  radicale/storage/multifilesystem_nolock.py  Normal file
@@ -0,0 +1,114 @@
+# This file is part of Radicale - CalDAV and CardDAV server
+# Copyright © 2021 Unrud <unrud@outlook.com>
+#
+# This library is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This library is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with Radicale.  If not, see <http://www.gnu.org/licenses/>.
+
+"""
+The multifilesystem backend without file-based locking.
+"""
+
+import threading
+from collections import deque
+from typing import ClassVar, Deque, Dict, Hashable, Iterator, Type
+
+from radicale import config, pathutils, types
+from radicale.storage import multifilesystem
+
+
+class RwLock(pathutils.RwLock):
+
+    _cond: threading.Condition
+
+    def __init__(self) -> None:
+        super().__init__("")
+        self._cond = threading.Condition(self._lock)
+
+    @types.contextmanager
+    def acquire(self, mode: str, user: str = "") -> Iterator[None]:
+        if mode not in "rw":
+            raise ValueError("Invalid mode: %r" % mode)
+        with self._cond:
+            self._cond.wait_for(lambda: not self._writer and (
+                mode == "r" or self._readers == 0))
+            if mode == "r":
+                self._readers += 1
+            else:
+                self._writer = True
+        try:
+            yield
+        finally:
+            with self._cond:
+                if mode == "r":
+                    self._readers -= 1
+                self._writer = False
+                if self._readers == 0:
+                    self._cond.notify_all()
+
+
+class LockDict:
+
+    _lock: threading.Lock
+    _dict: Dict[Hashable, Deque[threading.Lock]]
+
+    def __init__(self) -> None:
+        self._lock = threading.Lock()
+        self._dict = {}
+
+    @types.contextmanager
+    def acquire(self, key: Hashable) -> Iterator[None]:
+        with self._lock:
+            waiters = self._dict.get(key)
+            if waiters is None:
+                self._dict[key] = waiters = deque()
+            wait = bool(waiters)
+            waiter = threading.Lock()
+            waiter.acquire()
+            waiters.append(waiter)
+        if wait:
+            waiter.acquire()
+        try:
+            yield
+        finally:
+            with self._lock:
+                assert waiters[0] is waiter and self._dict[key] is waiters
+                del waiters[0]
+                if waiters:
+                    waiters[0].release()
+                else:
+                    del self._dict[key]
+
+
+class Collection(multifilesystem.Collection):
+
+    _storage: "Storage"
+
+    @types.contextmanager
+    def _acquire_cache_lock(self, ns: str = "") -> Iterator[None]:
+        if self._storage._lock.locked == "w":
+            yield
+            return
+        with self._storage._cache_lock.acquire((self.path, ns)):
+            yield
+
+
+class Storage(multifilesystem.Storage):
+
+    _collection_class: ClassVar[Type[Collection]] = Collection
+
+    _cache_lock: LockDict
+
+    def __init__(self, configuration: config.Configuration) -> None:
+        super().__init__(configuration)
+        self._lock = RwLock()
+        self._cache_lock = LockDict()
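LockDict in the new file hands out per-key mutual exclusion in FIFO order by parking each waiter on its own pre-acquired threading.Lock; releasing unblocks the next waiter in the queue. A self-contained sketch of that idea, outside Radicale (class and names are illustrative, and the acquire/release API replaces the context manager for brevity):

```python
import threading
from collections import deque


class PerKeyLock:
    """FIFO mutual exclusion per key: each waiter blocks on a private
    Lock that the previous holder releases on its way out."""

    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}

    def acquire(self, key):
        with self._lock:
            queue = self._waiters.setdefault(key, deque())
            me = threading.Lock()
            me.acquire()          # pre-acquire our parking lock
            wait = bool(queue)    # someone ahead of us?
            queue.append(me)
        if wait:
            me.acquire()          # blocks until predecessor releases us

    def release(self, key):
        with self._lock:
            queue = self._waiters[key]
            queue.popleft()       # drop our own entry (head of queue)
            if queue:
                queue[0].release()  # wake the next waiter, FIFO
            else:
                del self._waiters[key]


counter = 0
lock = PerKeyLock()


def work():
    global counter
    for _ in range(10000):
        lock.acquire("k")
        counter += 1
        lock.release("k")


threads = [threading.Thread(target=work) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000
```

Keys that are never contended cost only a dict entry while held, and the dict is cleaned up as soon as the last holder of a key releases it, which is the same bookkeeping the real LockDict asserts in its finally block.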
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
@@ -22,13 +22,19 @@ Tests for Radicale.

 import base64
 import logging
+import shutil
 import sys
+import tempfile
+import xml.etree.ElementTree as ET
 from io import BytesIO
+from typing import Any, Dict, List, Optional, Tuple, Union

 import defusedxml.ElementTree as DefusedET

 import radicale
-from radicale import xmlutils
+from radicale import app, config, types, xmlutils
+
+RESPONSES = Dict[str, Union[int, Dict[str, Tuple[int, ET.Element]]]]

 # Enable debug output
 radicale.log.logger.setLevel(logging.DEBUG)
@@ -37,50 +43,84 @@ radicale.log.logger.setLevel(logging.DEBUG)
 class BaseTest:
     """Base class for tests."""

-    def request(self, method, path, data=None, login=None, **args):
+    colpath: str
+    configuration: config.Configuration
+    application: app.Application
+
+    def setup(self) -> None:
+        self.configuration = config.load()
+        self.colpath = tempfile.mkdtemp()
+        self.configure({
+            "storage": {"filesystem_folder": self.colpath,
+                        # Disable syncing to disk for better performance
+                        "_filesystem_fsync": "False"},
+            # Set incorrect authentication delay to a short duration
+            "auth": {"delay": "0.001"}})
+
+    def configure(self, config_: types.CONFIG) -> None:
+        self.configuration.update(config_, "test", privileged=True)
+        self.application = app.Application(self.configuration)
+
+    def teardown(self) -> None:
+        shutil.rmtree(self.colpath)
+
+    def request(self, method: str, path: str, data: Optional[str] = None,
+                check: Optional[int] = None, **kwargs
+                ) -> Tuple[int, Dict[str, str], str]:
         """Send a request."""
-        for key in args:
-            args[key.upper()] = args[key]
+        login = kwargs.pop("login", None)
+        if login is not None and not isinstance(login, str):
+            raise TypeError("login argument must be %r, not %r" %
+                            (str, type(login)))
+        environ: Dict[str, Any] = {k.upper(): v for k, v in kwargs.items()}
+        for k, v in environ.items():
+            if not isinstance(v, str):
+                raise TypeError("type of %r is %r, expected %r" %
+                                (k, type(v), str))
+        encoding: str = self.configuration.get("encoding", "request")
         if login:
-            args["HTTP_AUTHORIZATION"] = "Basic " + base64.b64encode(
-                login.encode()).decode()
-        args["REQUEST_METHOD"] = method.upper()
-        args["PATH_INFO"] = path
+            environ["HTTP_AUTHORIZATION"] = "Basic " + base64.b64encode(
+                login.encode(encoding)).decode()
+        environ["REQUEST_METHOD"] = method.upper()
+        environ["PATH_INFO"] = path
         if data:
-            data = data.encode()
-            args["wsgi.input"] = BytesIO(data)
-            args["CONTENT_LENGTH"] = str(len(data))
-        args["wsgi.errors"] = sys.stderr
+            data_bytes = data.encode(encoding)
+            environ["wsgi.input"] = BytesIO(data_bytes)
+            environ["CONTENT_LENGTH"] = str(len(data_bytes))
+        environ["wsgi.errors"] = sys.stderr
         status = headers = None

-        def start_response(status_, headers_):
+        def start_response(status_: str, headers_: List[Tuple[str, str]]
+                           ) -> None:
             nonlocal status, headers
-            status = status_
-            headers = headers_
-        answer = self.application(args, start_response)
+            status = int(status_.split()[0])
+            headers = dict(headers_)
+        answers = list(self.application(environ, start_response))
+        assert status is not None and headers is not None
+        assert check is None or status == check, "%d != %d" % (status, check)

-        return (int(status.split()[0]), dict(headers),
-                answer[0].decode() if answer else None)
+        return status, headers, answers[0].decode() if answers else ""

     @staticmethod
-    def parse_responses(text):
+    def parse_responses(text: str) -> RESPONSES:
         xml = DefusedET.fromstring(text)
         assert xml.tag == xmlutils.make_clark("D:multistatus")
-        path_responses = {}
+        path_responses: Dict[str, Union[
+            int, Dict[str, Tuple[int, ET.Element]]]] = {}
         for response in xml.findall(xmlutils.make_clark("D:response")):
             href = response.find(xmlutils.make_clark("D:href"))
             assert href.text not in path_responses
-            prop_respones = {}
+            prop_respones: Dict[str, Tuple[int, ET.Element]] = {}
             for propstat in response.findall(
                     xmlutils.make_clark("D:propstat")):
                 status = propstat.find(xmlutils.make_clark("D:status"))
                 assert status.text.startswith("HTTP/1.1 ")
                 status_code = int(status.text.split(" ")[1])
-                for prop in propstat.findall(xmlutils.make_clark("D:prop")):
-                    for element in prop:
-                        human_tag = xmlutils.make_human_tag(element.tag)
-                        assert human_tag not in prop_respones
-                        prop_respones[human_tag] = (status_code, element)
+                for element in propstat.findall(
+                        "./%s/*" % xmlutils.make_clark("D:prop")):
+                    human_tag = xmlutils.make_human_tag(element.tag)
+                    assert human_tag not in prop_respones
+                    prop_respones[human_tag] = (status_code, element)
             status = response.find(xmlutils.make_clark("D:status"))
             if status is not None:
                 assert not prop_respones
@@ -91,66 +131,84 @@ class BaseTest:
             path_responses[href.text] = prop_respones
         return path_responses

-    @staticmethod
-    def _check_status(status, good_status, check=True):
-        if check is True:
-            assert status == good_status
-        elif check is not False:
-            assert status == check
-        return status == good_status
-
-    def get(self, path, check=True, **args):
-        status, _, answer = self.request("GET", path, **args)
-        self._check_status(status, 200, check)
+    def get(self, path: str, check: Optional[int] = 200, **kwargs
+            ) -> Tuple[int, str]:
+        assert "data" not in kwargs
+        status, _, answer = self.request("GET", path, check=check, **kwargs)
         return status, answer

-    def put(self, path, data, check=True, **args):
-        status, _, answer = self.request("PUT", path, data, **args)
-        self._check_status(status, 201, check)
+    def post(self, path: str, data: str = None, check: Optional[int] = 200,
+             **kwargs) -> Tuple[int, str]:
+        status, _, answer = self.request("POST", path, data, check=check,
+                                         **kwargs)
+        return status, answer
+
+    def put(self, path: str, data: str, check: Optional[int] = 201,
+            **kwargs) -> Tuple[int, str]:
+        status, _, answer = self.request("PUT", path, data, check=check,
+                                         **kwargs)
+        return status, answer

-    def propfind(self, path, data=None, check=True, **args):
-        status, _, answer = self.request("PROPFIND", path, data, **args)
-        if not self._check_status(status, 207, check):
-            return status, None
+    def propfind(self, path: str, data: Optional[str] = None,
+                 check: Optional[int] = 207, **kwargs
+                 ) -> Tuple[int, RESPONSES]:
+        status, _, answer = self.request("PROPFIND", path, data, check=check,
+                                         **kwargs)
+        if status < 200 or 300 <= status:
+            return status, {}
+        assert answer is not None
         responses = self.parse_responses(answer)
-        if args.get("HTTP_DEPTH", 0) == 0:
+        if kwargs.get("HTTP_DEPTH", "0") == "0":
             assert len(responses) == 1 and path in responses
         return status, responses

-    def proppatch(self, path, data=None, check=True, **args):
-        status, _, answer = self.request("PROPPATCH", path, data, **args)
-        if not self._check_status(status, 207, check):
-            return status, None
+    def proppatch(self, path: str, data: Optional[str] = None,
+                  check: Optional[int] = 207, **kwargs
+                  ) -> Tuple[int, RESPONSES]:
+        status, _, answer = self.request("PROPPATCH", path, data, check=check,
+                                         **kwargs)
+        if status < 200 or 300 <= status:
+            return status, {}
+        assert answer is not None
         responses = self.parse_responses(answer)
         assert len(responses) == 1 and path in responses
         return status, responses

-    def report(self, path, data, check=True, **args):
-        status, _, answer = self.request("REPORT", path, data, **args)
-        if not self._check_status(status, 207, check):
-            return status, None
+    def report(self, path: str, data: str, check: Optional[int] = 207,
+               **kwargs) -> Tuple[int, RESPONSES]:
+        status, _, answer = self.request("REPORT", path, data, check=check,
+                                         **kwargs)
+        if status < 200 or 300 <= status:
+            return status, {}
+        assert answer is not None
         return status, self.parse_responses(answer)

-    def delete(self, path, check=True, **args):
-        status, _, answer = self.request("DELETE", path, **args)
-        if not self._check_status(status, 200, check):
-            return status, None
+    def delete(self, path: str, check: Optional[int] = 200, **kwargs
+               ) -> Tuple[int, RESPONSES]:
+        assert "data" not in kwargs
+        status, _, answer = self.request("DELETE", path, check=check, **kwargs)
+        if status < 200 or 300 <= status:
+            return status, {}
+        assert answer is not None
         responses = self.parse_responses(answer)
         assert len(responses) == 1 and path in responses
         return status, responses

-    def mkcalendar(self, path, data=None, check=True, **args):
-        status, _, answer = self.request("MKCALENDAR", path, data, **args)
-        self._check_status(status, 201, check)
+    def mkcalendar(self, path: str, data: Optional[str] = None,
+                   check: Optional[int] = 201, **kwargs
+                   ) -> Tuple[int, str]:
+        status, _, answer = self.request("MKCALENDAR", path, data, check=check,
+                                         **kwargs)
         return status, answer

-    def mkcol(self, path, data=None, check=True, **args):
-        status, _, _ = self.request("MKCOL", path, data, **args)
-        self._check_status(status, 201, check)
+    def mkcol(self, path: str, data: Optional[str] = None,
+              check: Optional[int] = 201, **kwargs) -> int:
+        status, _, _ = self.request("MKCOL", path, data, check=check, **kwargs)
+        return status

-    def create_addressbook(self, path, check=True, **args):
+    def create_addressbook(self, path: str, check: Optional[int] = 201,
+                           **kwargs) -> int:
+        assert "data" not in kwargs
         return self.mkcol(path, """\
 <?xml version="1.0" encoding="UTF-8" ?>
 <create xmlns="DAV:" xmlns:CR="urn:ietf:params:xml:ns:carddav">
@@ -162,4 +220,4 @@ class BaseTest:
         </resourcetype>
       </prop>
     </set>
-</create>""", check=check, **args)
+</create>""", check=check, **kwargs)
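BaseTest.request drives the WSGI application in-process, with a hand-built environ dict and a start_response closure, instead of going through a socket. A minimal standalone sketch of that technique against a toy app (the app and helper here are illustrative, not Radicale's):

```python
import sys
from io import BytesIO


def app(environ, start_response):
    # Toy WSGI app: echoes the request method in the body.
    body = environ["REQUEST_METHOD"].encode()
    start_response("200 OK", [("Content-Length", str(len(body)))])
    return [body]


def request(application, method, path, data=None):
    """Call a WSGI app directly, returning (status, headers, body)."""
    environ = {"REQUEST_METHOD": method.upper(), "PATH_INFO": path,
               "wsgi.errors": sys.stderr}
    if data is not None:
        raw = data.encode()
        environ["wsgi.input"] = BytesIO(raw)
        environ["CONTENT_LENGTH"] = str(len(raw))
    status = headers = None

    def start_response(status_, headers_):
        # Capture the response line and headers via closure variables.
        nonlocal status, headers
        status = int(status_.split()[0])
        headers = dict(headers_)

    answers = list(application(environ, start_response))
    return status, headers, answers[0].decode() if answers else ""


print(request(app, "get", "/"))  # (200, {'Content-Length': '3'}, 'GET')
```

Because no server is involved, tests stay fast and failures point directly at application code rather than transport issues.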
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2017 Guillaume Ayoub
@@ -28,7 +28,8 @@ from radicale import auth


 class Auth(auth.BaseAuth):
-    def login(self, login, password):
+
+    def login(self, login: str, password: str) -> str:
         if login == "tmp":
             return login
         return ""
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
 # This library is free software: you can redistribute it and/or modify
@@ -23,7 +23,8 @@ from radicale import pathutils, rights


 class Rights(rights.BaseRights):
-    def authorization(self, user, path):
+
+    def authorization(self, user: str, path: str) -> str:
         sane_path = pathutils.strip_path(path)
         if sane_path not in ("tmp", "other"):
             return ""
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
@@ -27,8 +27,10 @@ from radicale.storage import BaseCollection, multifilesystem


 class Collection(multifilesystem.Collection):
+
     sync = BaseCollection.sync


 class Storage(multifilesystem.Storage):
+
     _collection_class = Collection
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
 # This library is free software: you can redistribute it and/or modify
@@ -21,9 +21,16 @@ Custom web plugin.

 from http import client

-from radicale import web
+from radicale import httputils, types, web


 class Web(web.BaseWeb):
-    def get(self, environ, base_prefix, path, user):
+
+    def get(self, environ: types.WSGIEnviron, base_prefix: str, path: str,
+            user: str) -> types.WSGIResponse:
         return client.OK, {"Content-Type": "text/plain"}, "custom"
+
+    def post(self, environ: types.WSGIEnviron, base_prefix: str, path: str,
+             user: str) -> types.WSGIResponse:
+        content = httputils.read_request_body(self.configuration, environ)
+        return client.OK, {"Content-Type": "text/plain"}, "echo:" + content
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2017 Guillaume Ayoub
@@ -26,22 +26,21 @@ This module offers helpers to use in tests.

 import os

-EXAMPLES_FOLDER = os.path.join(os.path.dirname(__file__), "static")
+from radicale import config, types
+
+EXAMPLES_FOLDER: str = os.path.join(os.path.dirname(__file__), "static")


-def get_file_path(file_name):
+def get_file_path(file_name: str) -> str:
     return os.path.join(EXAMPLES_FOLDER, file_name)


-def get_file_content(file_name):
-    try:
-        with open(get_file_path(file_name), encoding="utf-8") as fd:
-            return fd.read()
-    except IOError:
-        print("Couldn't open the file %s" % file_name)
+def get_file_content(file_name: str) -> str:
+    with open(get_file_path(file_name), encoding="utf-8") as f:
+        return f.read()


-def configuration_to_dict(configuration):
+def configuration_to_dict(configuration: config.Configuration) -> types.CONFIG:
     """Convert configuration to a dict with raw values."""
     return {section: {option: configuration.get_raw(section, option)
                       for option in configuration.options(section)
8
radicale/tests/static/contact_photo_with_data_uri.vcf
Normal file
8
radicale/tests/static/contact_photo_with_data_uri.vcf
Normal file
|
@ -0,0 +1,8 @@
|
|||
BEGIN:VCARD
|
||||
VERSION:3.0
|
||||
UID:contact
|
||||
N:Contact;;;;
|
||||
FN:Contact
|
||||
NICKNAME:test
|
||||
PHOTO;ENCODING=b;TYPE=png:data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAIAAACQd1PeAAAAD0lEQVQIHQEEAPv/AP///wX+Av4DfRnGAAAAAElFTkSuQmCC
|
||||
END:VCARD
|
33
radicale/tests/static/event_mixed_datetime_and_date.ics
Normal file
33
radicale/tests/static/event_mixed_datetime_and_date.ics
Normal file
|
@ -0,0 +1,33 @@
|
|||
BEGIN:VCALENDAR
|
||||
PRODID:-//Mozilla.org/NONSGML Mozilla Calendar V1.1//EN
|
||||
VERSION:2.0
|
||||
BEGIN:VTIMEZONE
|
||||
TZID:Europe/Paris
|
||||
X-LIC-LOCATION:Europe/Paris
|
||||
BEGIN:DAYLIGHT
|
||||
TZOFFSETFROM:+0100
|
||||
TZOFFSETTO:+0200
|
||||
TZNAME:CEST
|
||||
DTSTART:19700329T020000
|
||||
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=3
|
||||
END:DAYLIGHT
|
||||
BEGIN:STANDARD
|
||||
TZOFFSETFROM:+0200
|
||||
TZOFFSETTO:+0100
|
||||
TZNAME:CET
|
||||
DTSTART:19701025T030000
|
||||
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=10
|
||||
END:STANDARD
|
||||
END:VTIMEZONE
|
||||
BEGIN:VEVENT
|
||||
CREATED:20130902T150157Z
|
||||
LAST-MODIFIED:20130902T150158Z
|
||||
DTSTAMP:20130902T150158Z
|
||||
UID:event_mixed_datetime_and_date
|
||||
SUMMARY:Event
|
||||
DTSTART;TZID=Europe/Paris:20130901T180000
|
||||
DTEND;TZID=Europe/Paris:20130901T190000
|
||||
RRULE:FREQ=DAILY;COUNT=3
|
||||
EXDATE;VALUE=DATE:20130902
|
||||
END:VEVENT
|
||||
END:VCALENDAR
|
16
radicale/tests/static/event_multiple_case_sensitive_uids.ics
Normal file
16
radicale/tests/static/event_multiple_case_sensitive_uids.ics
Normal file
|
@ -0,0 +1,16 @@
|
|||
BEGIN:VCALENDAR
|
||||
PRODID:-//Mozilla.org/NONSGML Mozilla Calendar V1.1//EN
|
||||
VERSION:2.0
|
||||
BEGIN:VEVENT
|
||||
UID:event
|
||||
SUMMARY:Event 1
|
||||
DTSTART:20130901T190000
|
||||
DTEND:20130901T200000
|
||||
END:VEVENT
|
||||
BEGIN:VEVENT
|
||||
UID:EVENT
|
||||
SUMMARY:Event 2
|
||||
DTSTART:20130901T200000
|
||||
DTEND:20130901T210000
|
||||
END:VEVENT
|
||||
END:VCALENDAR
|
9
radicale/tests/static/mkcol_make_calendar.xml
Normal file
9
radicale/tests/static/mkcol_make_calendar.xml
Normal file
|
@ -0,0 +1,9 @@
|
|||
<?xml version="1.0" encoding="utf-8"?>
|
||||
<D:mkcol xmlns:D="DAV:" xmlns:C="urn:ietf:params:xml:ns:caldav">
|
||||
<D:set>
|
||||
<D:prop>
|
||||
<D:resourcetype><collection /><C:calendar /></D:resourcetype>
|
||||
<I:calendar-color xmlns:I="http://apple.com/ns/ical/">#BADA55</I:calendar-color>
|
||||
</D:prop>
|
||||
</D:set>
|
||||
</D:mkcol>
|
7
radicale/tests/static/propfind_multiple.xml
Normal file
7
radicale/tests/static/propfind_multiple.xml
Normal file
|
@ -0,0 +1,7 @@
|
|||
<?xml version="1.0" encoding="utf-8"?>
|
||||
<D:propfind xmlns:D="DAV:">
|
||||
<D:prop>
|
||||
<I:calendar-color xmlns:I="http://apple.com/ns/ical/" />
|
||||
<C:calendar-description xmlns:C="urn:ietf:params:xml:ns:caldav" />
|
||||
</D:prop>
|
||||
</D:propfind>
|
|
@ -0,0 +1,8 @@
|
|||
<?xml version="1.0" encoding="utf-8"?>
|
||||
<D:propertyupdate xmlns:D="DAV:">
|
||||
<D:remove>
|
||||
<D:prop>
|
||||
<I:calendar-color xmlns:I="http://apple.com/ns/ical/" />
|
||||
</D:prop>
|
||||
</D:remove>
|
||||
</D:propertyupdate>
|
9
radicale/tests/static/proppatch_remove_multiple1.xml
Normal file
9
radicale/tests/static/proppatch_remove_multiple1.xml
Normal file
|
@ -0,0 +1,9 @@
|
|||
<?xml version="1.0" encoding="utf-8"?>
|
||||
<D:propertyupdate xmlns:D="DAV:">
|
||||
<D:remove>
|
||||
<D:prop>
|
||||
<I:calendar-color xmlns:I="http://apple.com/ns/ical/" />
|
||||
<C:calendar-description xmlns:C="urn:ietf:params:xml:ns:caldav" />
|
||||
</D:prop>
|
||||
</D:remove>
|
||||
</D:propertyupdate>
|
13
radicale/tests/static/proppatch_remove_multiple2.xml
Normal file
13
radicale/tests/static/proppatch_remove_multiple2.xml
Normal file
|
@ -0,0 +1,13 @@
|
|||
<?xml version="1.0" encoding="utf-8"?>
|
||||
<D:propertyupdate xmlns:D="DAV:">
|
||||
<D:remove>
|
||||
<D:prop>
|
||||
<I:calendar-color xmlns:I="http://apple.com/ns/ical/" />
|
||||
</D:prop>
|
||||
</D:remove>
|
||||
<D:remove>
|
||||
<D:prop>
|
||||
<C:calendar-description xmlns:C="urn:ietf:params:xml:ns:caldav" />
|
||||
</D:prop>
|
||||
</D:remove>
|
||||
</D:propertyupdate>
|
13
radicale/tests/static/proppatch_set_and_remove.xml
Normal file
13
radicale/tests/static/proppatch_set_and_remove.xml
Normal file
|
@ -0,0 +1,13 @@
|
|||
<?xml version="1.0" encoding="utf-8"?>
|
||||
<D:propertyupdate xmlns:D="DAV:">
|
||||
<D:remove>
|
||||
<D:prop>
|
||||
<I:calendar-color xmlns:I="http://apple.com/ns/ical/" />
|
||||
</D:prop>
|
||||
</D:remove>
|
||||
<D:set>
|
||||
<D:prop>
|
||||
<C:calendar-description xmlns:C="urn:ietf:params:xml:ns:caldav">test2</C:calendar-description>
|
||||
</D:prop>
|
||||
</D:set>
|
||||
</D:propertyupdate>
|
9
radicale/tests/static/proppatch_set_multiple1.xml
Normal file
9
radicale/tests/static/proppatch_set_multiple1.xml
Normal file
|
@ -0,0 +1,9 @@
|
|||
<?xml version="1.0" encoding="utf-8"?>
|
||||
<D:propertyupdate xmlns:D="DAV:">
|
||||
<D:set>
|
||||
<D:prop>
|
||||
<I:calendar-color xmlns:I="http://apple.com/ns/ical/">#BADA55</I:calendar-color>
|
||||
<C:calendar-description xmlns:C="urn:ietf:params:xml:ns:caldav">test</C:calendar-description>
|
||||
</D:prop>
|
||||
</D:set>
|
||||
</D:propertyupdate>
|
radicale/tests/static/proppatch_set_multiple2.xml (new file, 13 lines)
@@ -0,0 +1,13 @@
<?xml version="1.0" encoding="utf-8"?>
<D:propertyupdate xmlns:D="DAV:">
  <D:set>
    <D:prop>
      <I:calendar-color xmlns:I="http://apple.com/ns/ical/">#BADA55</I:calendar-color>
    </D:prop>
  </D:set>
  <D:set>
    <D:prop>
      <C:calendar-description xmlns:C="urn:ietf:params:xml:ns:caldav">test</C:calendar-description>
    </D:prop>
  </D:set>
</D:propertyupdate>
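The fixtures above are WebDAV PROPPATCH bodies (RFC 4918 `propertyupdate` with `set` and `remove` blocks). A minimal sketch of parsing such a body with Python's standard library — `parse_propertyupdate` is an illustrative helper, not Radicale's API:

```python
import xml.etree.ElementTree as ET

PROPPATCH = """<?xml version="1.0" encoding="utf-8"?>
<D:propertyupdate xmlns:D="DAV:">
  <D:set>
    <D:prop>
      <I:calendar-color xmlns:I="http://apple.com/ns/ical/">#BADA55</I:calendar-color>
    </D:prop>
  </D:set>
  <D:remove>
    <D:prop>
      <C:calendar-description xmlns:C="urn:ietf:params:xml:ns:caldav" />
    </D:prop>
  </D:remove>
</D:propertyupdate>"""


def parse_propertyupdate(body):
    """Return ({prop: value} to set, {prop} to remove); prop names are
    Clark-notation tags, as ElementTree reports namespaced elements."""
    root = ET.fromstring(body)
    ns = "{DAV:}"
    sets, removes = {}, set()
    for op in root:  # children are {DAV:}set / {DAV:}remove elements
        for prop in op.findall(ns + "prop"):
            for element in prop:
                if op.tag == ns + "set":
                    sets[element.tag] = element.text
                else:
                    removes.add(element.tag)
    return sets, removes
```

Multiple `set`/`remove` blocks (as in the `*_multiple2` fixtures) fall out of the loop naturally, since each top-level child is visited in order.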
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2012-2016 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2019 Unrud <unrud@outlook.com>
@@ -22,12 +22,12 @@ Radicale tests with simple requests and authentication.
 """

 import os
-import shutil
-import tempfile
+import sys
+from typing import Iterable, Tuple, Union

 import pytest

-from radicale import Application, config, xmlutils
+from radicale import xmlutils
 from radicale.tests import BaseTest


@@ -37,43 +37,20 @@ class TestBaseAuthRequests(BaseTest):
     We should setup auth for each type before creating the Application object.

     """
-    def setup(self):
-        self.configuration = config.load()
-        self.colpath = tempfile.mkdtemp()
-        self.configuration.update({
-            "storage": {"filesystem_folder": self.colpath,
-                        # Disable syncing to disk for better performance
-                        "_filesystem_fsync": "False"},
-            # Set incorrect authentication delay to a very low value
-            "auth": {"delay": "0.002"}}, "test", privileged=True)
-
-    def teardown(self):
-        shutil.rmtree(self.colpath)
-
-    def _test_htpasswd(self, htpasswd_encryption, htpasswd_content,
-                       test_matrix="ascii"):
+    def _test_htpasswd(self, htpasswd_encryption: str, htpasswd_content: str,
+                       test_matrix: Union[str, Iterable[Tuple[str, str, bool]]]
+                       = "ascii") -> None:
         """Test htpasswd authentication with user "tmp" and password "bepo" for
         ``test_matrix`` "ascii" or user "😀" and password "🔑" for
         ``test_matrix`` "unicode"."""
         if htpasswd_encryption == "bcrypt":
             try:
                 from passlib.exc import MissingBackendError
                 from passlib.hash import bcrypt
             except ImportError:
                 pytest.skip("passlib[bcrypt] is not installed")
             try:
                 bcrypt.hash("test-bcrypt-backend")
             except MissingBackendError:
                 pytest.skip("bcrypt backend for passlib is not installed")
         htpasswd_file_path = os.path.join(self.colpath, ".htpasswd")
-        encoding = self.configuration.get("encoding", "stock")
+        encoding: str = self.configuration.get("encoding", "stock")
         with open(htpasswd_file_path, "w", encoding=encoding) as f:
             f.write(htpasswd_content)
-        self.configuration.update({
-            "auth": {"type": "htpasswd",
-                     "htpasswd_filename": htpasswd_file_path,
-                     "htpasswd_encryption": htpasswd_encryption}}, "test")
-        self.application = Application(self.configuration)
+        self.configure({"auth": {"type": "htpasswd",
+                                 "htpasswd_filename": htpasswd_file_path,
+                                 "htpasswd_encryption": htpasswd_encryption}})
         if test_matrix == "ascii":
             test_matrix = (("tmp", "bepo", True), ("tmp", "tmp", False),
                            ("tmp", "", False), ("unk", "unk", False),
@@ -82,56 +59,57 @@ class TestBaseAuthRequests(BaseTest):
             test_matrix = (("😀", "🔑", True), ("😀", "🌹", False),
                            ("😁", "🔑", False), ("😀", "", False),
                            ("", "🔑", False), ("", "", False))
         elif isinstance(test_matrix, str):
             raise ValueError("Unknown test matrix %r" % test_matrix)
         for user, password, valid in test_matrix:
             self.propfind("/", check=207 if valid else 401,
                           login="%s:%s" % (user, password))

-    def test_htpasswd_plain(self):
+    def test_htpasswd_plain(self) -> None:
         self._test_htpasswd("plain", "tmp:bepo")

-    def test_htpasswd_plain_password_split(self):
+    def test_htpasswd_plain_password_split(self) -> None:
         self._test_htpasswd("plain", "tmp:be:po", (
             ("tmp", "be:po", True), ("tmp", "bepo", False)))

-    def test_htpasswd_plain_unicode(self):
+    def test_htpasswd_plain_unicode(self) -> None:
         self._test_htpasswd("plain", "😀:🔑", "unicode")

-    def test_htpasswd_md5(self):
+    def test_htpasswd_md5(self) -> None:
         self._test_htpasswd("md5", "tmp:$apr1$BI7VKCZh$GKW4vq2hqDINMr8uv7lDY/")

     def test_htpasswd_md5_unicode(self):
         self._test_htpasswd(
             "md5", "😀:$apr1$w4ev89r1$29xO8EvJmS2HEAadQ5qy11", "unicode")

-    def test_htpasswd_bcrypt(self):
+    def test_htpasswd_bcrypt(self) -> None:
         self._test_htpasswd("bcrypt", "tmp:$2y$05$oD7hbiQFQlvCM7zoalo/T.MssV3V"
                             "NTRI3w5KDnj8NTUKJNWfVpvRq")

-    def test_htpasswd_bcrypt_unicode(self):
+    def test_htpasswd_bcrypt_unicode(self) -> None:
         self._test_htpasswd("bcrypt", "😀:$2y$10$Oyz5aHV4MD9eQJbk6GPemOs4T6edK"
                             "6U9Sqlzr.W1mMVCS8wJUftnW", "unicode")

-    def test_htpasswd_multi(self):
+    def test_htpasswd_multi(self) -> None:
         self._test_htpasswd("plain", "ign:ign\ntmp:bepo")

-    @pytest.mark.skipif(os.name == "nt", reason="leading and trailing "
+    @pytest.mark.skipif(sys.platform == "win32", reason="leading and trailing "
                         "whitespaces not allowed in file names")
-    def test_htpasswd_whitespace_user(self):
+    def test_htpasswd_whitespace_user(self) -> None:
         for user in (" tmp", "tmp ", " tmp "):
             self._test_htpasswd("plain", "%s:bepo" % user, (
                 (user, "bepo", True), ("tmp", "bepo", False)))

-    def test_htpasswd_whitespace_password(self):
+    def test_htpasswd_whitespace_password(self) -> None:
         for password in (" bepo", "bepo ", " bepo "):
             self._test_htpasswd("plain", "tmp:%s" % password, (
                 ("tmp", password, True), ("tmp", "bepo", False)))

-    def test_htpasswd_comment(self):
+    def test_htpasswd_comment(self) -> None:
         self._test_htpasswd("plain", "#comment\n #comment\n \ntmp:bepo\n\n")

-    def test_remote_user(self):
-        self.configuration.update({"auth": {"type": "remote_user"}}, "test")
-        self.application = Application(self.configuration)
+    def test_remote_user(self) -> None:
+        self.configure({"auth": {"type": "remote_user"}})
         _, responses = self.propfind("/", """\
 <?xml version="1.0" encoding="utf-8"?>
 <propfind xmlns="DAV:">
@@ -139,14 +117,16 @@ class TestBaseAuthRequests(BaseTest):
     <current-user-principal />
   </prop>
 </propfind>""", REMOTE_USER="test")
-        status, prop = responses["/"]["D:current-user-principal"]
+        assert responses is not None
+        response = responses["/"]
+        assert not isinstance(response, int)
+        status, prop = response["D:current-user-principal"]
         assert status == 200
-        assert prop.find(xmlutils.make_clark("D:href")).text == "/test/"
+        href_element = prop.find(xmlutils.make_clark("D:href"))
+        assert href_element is not None and href_element.text == "/test/"

-    def test_http_x_remote_user(self):
-        self.configuration.update(
-            {"auth": {"type": "http_x_remote_user"}}, "test")
-        self.application = Application(self.configuration)
+    def test_http_x_remote_user(self) -> None:
+        self.configure({"auth": {"type": "http_x_remote_user"}})
         _, responses = self.propfind("/", """\
 <?xml version="1.0" encoding="utf-8"?>
 <propfind xmlns="DAV:">
@@ -154,13 +134,15 @@ class TestBaseAuthRequests(BaseTest):
     <current-user-principal />
   </prop>
 </propfind>""", HTTP_X_REMOTE_USER="test")
-        status, prop = responses["/"]["D:current-user-principal"]
+        assert responses is not None
+        response = responses["/"]
+        assert not isinstance(response, int)
+        status, prop = response["D:current-user-principal"]
         assert status == 200
-        assert prop.find(xmlutils.make_clark("D:href")).text == "/test/"
+        href_element = prop.find(xmlutils.make_clark("D:href"))
+        assert href_element is not None and href_element.text == "/test/"

-    def test_custom(self):
+    def test_custom(self) -> None:
         """Custom authentication."""
-        self.configuration.update(
-            {"auth": {"type": "radicale.tests.custom.auth"}}, "test")
-        self.application = Application(self.configuration)
+        self.configure({"auth": {"type": "radicale.tests.custom.auth"}})
         self.propfind("/tmp/", login="tmp:")
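The htpasswd tests above pin down several parsing details: comment and blank lines are ignored, only the first colon separates user from password (`tmp:be:po`), and whitespace in user names and passwords is significant. A stdlib-only sketch consistent with those fixtures — `parse_plain_htpasswd` is illustrative, not Radicale's parser, and handles only the "plain" encryption used in the tests:

```python
def parse_plain_htpasswd(content):
    """Parse plain-text htpasswd content into {user: password}."""
    entries = {}
    for line in content.splitlines():
        # Skip blank lines and comments (possibly indented), but do NOT
        # strip the entry itself: whitespace in user names is significant.
        if not line.strip() or line.strip().startswith("#"):
            continue
        user, _, password = line.partition(":")  # split on first ':' only
        entries[user] = password
    return entries
```

Real htpasswd files normally store hashes (md5/apr1, bcrypt), so a full implementation would verify the supplied password against the stored hash instead of comparing plain text.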
(File diff suppressed because it is too large)
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2019 Unrud <unrud@outlook.com>
 #
 # This library is free software: you can redistribute it and/or modify
@@ -18,23 +18,26 @@ import os
 import shutil
 import tempfile
 from configparser import RawConfigParser
+from typing import List, Tuple

 import pytest

-from radicale import config
+from radicale import config, types
 from radicale.tests.helpers import configuration_to_dict


 class TestConfig:
     """Test the configuration."""

-    def setup(self):
+    colpath: str
+
+    def setup(self) -> None:
         self.colpath = tempfile.mkdtemp()

-    def teardown(self):
+    def teardown(self) -> None:
         shutil.rmtree(self.colpath)

-    def _write_config(self, config_dict, name):
+    def _write_config(self, config_dict: types.CONFIG, name: str) -> str:
         parser = RawConfigParser()
         parser.read_dict(config_dict)
         config_path = os.path.join(self.colpath, name)
@@ -42,7 +45,7 @@ class TestConfig:
             parser.write(f)
         return config_path

-    def test_parse_compound_paths(self):
+    def test_parse_compound_paths(self) -> None:
         assert len(config.parse_compound_paths()) == 0
         assert len(config.parse_compound_paths("")) == 0
         assert len(config.parse_compound_paths(None, "")) == 0
@@ -62,16 +65,16 @@ class TestConfig:
             assert os.path.basename(paths[i][0]) == name
             assert paths[i][1] is ignore_if_missing

-    def test_load_empty(self):
+    def test_load_empty(self) -> None:
         config_path = self._write_config({}, "config")
         config.load([(config_path, False)])

-    def test_load_full(self):
+    def test_load_full(self) -> None:
         config_path = self._write_config(
             configuration_to_dict(config.load()), "config")
         config.load([(config_path, False)])

-    def test_load_missing(self):
+    def test_load_missing(self) -> None:
         config_path = os.path.join(self.colpath, "does_not_exist")
         config.load([(config_path, True)])
         with pytest.raises(Exception) as exc_info:
@@ -79,18 +82,20 @@ class TestConfig:
         e = exc_info.value
         assert "Failed to load config file %r" % config_path in str(e)

-    def test_load_multiple(self):
+    def test_load_multiple(self) -> None:
         config_path1 = self._write_config({
             "server": {"hosts": "192.0.2.1:1111"}}, "config1")
         config_path2 = self._write_config({
             "server": {"max_connections": 1111}}, "config2")
         configuration = config.load([(config_path1, False),
                                      (config_path2, False)])
-        assert len(configuration.get("server", "hosts")) == 1
-        assert configuration.get("server", "hosts")[0] == ("192.0.2.1", 1111)
+        server_hosts: List[Tuple[str, int]] = configuration.get(
+            "server", "hosts")
+        assert len(server_hosts) == 1
+        assert server_hosts[0] == ("192.0.2.1", 1111)
         assert configuration.get("server", "max_connections") == 1111

-    def test_copy(self):
+    def test_copy(self) -> None:
         configuration1 = config.load()
         configuration1.update({"server": {"max_connections": "1111"}}, "test")
         configuration2 = configuration1.copy()
@@ -98,14 +103,14 @@ class TestConfig:
         assert configuration1.get("server", "max_connections") == 1111
         assert configuration2.get("server", "max_connections") == 1112

-    def test_invalid_section(self):
+    def test_invalid_section(self) -> None:
         configuration = config.load()
         with pytest.raises(Exception) as exc_info:
             configuration.update({"does_not_exist": {"x": "x"}}, "test")
         e = exc_info.value
         assert "Invalid section 'does_not_exist'" in str(e)

-    def test_invalid_option(self):
+    def test_invalid_option(self) -> None:
         configuration = config.load()
         with pytest.raises(Exception) as exc_info:
             configuration.update({"server": {"x": "x"}}, "test")
@@ -113,7 +118,7 @@ class TestConfig:
         assert "Invalid option 'x'" in str(e)
         assert "section 'server'" in str(e)

-    def test_invalid_option_plugin(self):
+    def test_invalid_option_plugin(self) -> None:
         configuration = config.load()
         with pytest.raises(Exception) as exc_info:
             configuration.update({"auth": {"x": "x"}}, "test")
@@ -121,7 +126,7 @@ class TestConfig:
         assert "Invalid option 'x'" in str(e)
         assert "section 'auth'" in str(e)

-    def test_invalid_value(self):
+    def test_invalid_value(self) -> None:
         configuration = config.load()
         with pytest.raises(Exception) as exc_info:
             configuration.update({"server": {"max_connections": "x"}}, "test")
@@ -131,7 +136,7 @@ class TestConfig:
         assert "section 'server" in str(e)
         assert "'x'" in str(e)

-    def test_privileged(self):
+    def test_privileged(self) -> None:
         configuration = config.load()
         configuration.update({"server": {"_internal_server": "True"}},
                              "test", privileged=True)
@@ -141,9 +146,9 @@ class TestConfig:
         e = exc_info.value
         assert "Invalid option '_internal_server'" in str(e)

-    def test_plugin_schema(self):
-        plugin_schema = {"auth": {"new_option": {"value": "False",
-                                                 "type": bool}}}
+    def test_plugin_schema(self) -> None:
+        plugin_schema: types.CONFIG_SCHEMA = {
+            "auth": {"new_option": {"value": "False", "type": bool}}}
         configuration = config.load()
         configuration.update({"auth": {"type": "new_plugin"}}, "test")
         plugin_configuration = configuration.copy(plugin_schema)
@@ -152,26 +157,26 @@ class TestConfig:
         plugin_configuration = configuration.copy(plugin_schema)
         assert plugin_configuration.get("auth", "new_option") is True

-    def test_plugin_schema_duplicate_option(self):
-        plugin_schema = {"auth": {"type": {"value": "False",
-                                           "type": bool}}}
+    def test_plugin_schema_duplicate_option(self) -> None:
+        plugin_schema: types.CONFIG_SCHEMA = {
+            "auth": {"type": {"value": "False", "type": bool}}}
         configuration = config.load()
         with pytest.raises(Exception) as exc_info:
             configuration.copy(plugin_schema)
         e = exc_info.value
         assert "option already exists in 'auth': 'type'" in str(e)

-    def test_plugin_schema_invalid(self):
-        plugin_schema = {"server": {"new_option": {"value": "False",
-                                                   "type": bool}}}
+    def test_plugin_schema_invalid(self) -> None:
+        plugin_schema: types.CONFIG_SCHEMA = {
+            "server": {"new_option": {"value": "False", "type": bool}}}
         configuration = config.load()
         with pytest.raises(Exception) as exc_info:
             configuration.copy(plugin_schema)
         e = exc_info.value
         assert "not a plugin section: 'server" in str(e)

-    def test_plugin_schema_option_invalid(self):
-        plugin_schema = {"auth": {}}
+    def test_plugin_schema_option_invalid(self) -> None:
+        plugin_schema: types.CONFIG_SCHEMA = {"auth": {}}
         configuration = config.load()
         configuration.update({"auth": {"type": "new_plugin",
                                        "new_option": False}}, "test")
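`_write_config` above serializes a dict through `RawConfigParser`, and `test_load_multiple` checks that later config files override and extend earlier ones. The layering behavior can be illustrated with the stdlib parser alone (a sketch of the merge semantics, not of `radicale.config.load` itself):

```python
from configparser import RawConfigParser

parser = RawConfigParser()
# Successive read_dict calls merge into existing sections, the same way
# configuration files are applied in order; non-string values are
# stringified by read_dict, which is why _write_config can pass ints.
parser.read_dict({"server": {"hosts": "192.0.2.1:1111"}})
parser.read_dict({"server": {"max_connections": 1111}})

assert parser.get("server", "hosts") == "192.0.2.1:1111"
assert parser.getint("server", "max_connections") == 1111
```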
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2017-2019 Unrud <unrud@outlook.com>
 #
 # This library is free software: you can redistribute it and/or modify
@@ -19,10 +19,7 @@ Radicale tests with simple requests and rights.
 """

 import os
-import shutil
-import tempfile

-from radicale import Application, config
 from radicale.tests import BaseTest
 from radicale.tests.helpers import get_file_content

@@ -30,38 +27,25 @@ from radicale.tests.helpers import get_file_content
 class TestBaseRightsRequests(BaseTest):
     """Tests basic requests with rights."""

-    def setup(self):
-        self.configuration = config.load()
-        self.colpath = tempfile.mkdtemp()
-        self.configuration.update({
-            "storage": {"filesystem_folder": self.colpath,
-                        # Disable syncing to disk for better performance
-                        "_filesystem_fsync": "False"}},
-            "test", privileged=True)
-
-    def teardown(self):
-        shutil.rmtree(self.colpath)
-
-    def _test_rights(self, rights_type, user, path, mode, expected_status,
-                     with_auth=True):
+    def _test_rights(self, rights_type: str, user: str, path: str, mode: str,
+                     expected_status: int, with_auth: bool = True) -> None:
         assert mode in ("r", "w")
         assert user in ("", "tmp")
         htpasswd_file_path = os.path.join(self.colpath, ".htpasswd")
         with open(htpasswd_file_path, "w") as f:
             f.write("tmp:bepo\nother:bepo")
-        self.configuration.update({
+        self.configure({
             "rights": {"type": rights_type},
             "auth": {"type": "htpasswd" if with_auth else "none",
                      "htpasswd_filename": htpasswd_file_path,
-                     "htpasswd_encryption": "plain"}}, "test")
-        self.application = Application(self.configuration)
+                     "htpasswd_encryption": "plain"}})
         for u in ("tmp", "other"):
             # Indirect creation of principal collection
             self.propfind("/%s/" % u, login="%s:bepo" % u)
         (self.propfind if mode == "r" else self.proppatch)(
             path, check=expected_status, login="tmp:bepo" if user else None)

-    def test_owner_only(self):
+    def test_owner_only(self) -> None:
         self._test_rights("owner_only", "", "/", "r", 401)
         self._test_rights("owner_only", "", "/", "w", 401)
         self._test_rights("owner_only", "", "/tmp/", "r", 401)
@@ -73,13 +57,13 @@ class TestBaseRightsRequests(BaseTest):
         self._test_rights("owner_only", "tmp", "/other/", "r", 403)
         self._test_rights("owner_only", "tmp", "/other/", "w", 403)

-    def test_owner_only_without_auth(self):
+    def test_owner_only_without_auth(self) -> None:
         self._test_rights("owner_only", "", "/", "r", 207, False)
         self._test_rights("owner_only", "", "/", "w", 401, False)
         self._test_rights("owner_only", "", "/tmp/", "r", 207, False)
         self._test_rights("owner_only", "", "/tmp/", "w", 207, False)

-    def test_owner_write(self):
+    def test_owner_write(self) -> None:
         self._test_rights("owner_write", "", "/", "r", 401)
         self._test_rights("owner_write", "", "/", "w", 401)
         self._test_rights("owner_write", "", "/tmp/", "r", 401)
@@ -91,13 +75,13 @@ class TestBaseRightsRequests(BaseTest):
         self._test_rights("owner_write", "tmp", "/other/", "r", 207)
         self._test_rights("owner_write", "tmp", "/other/", "w", 403)

-    def test_owner_write_without_auth(self):
+    def test_owner_write_without_auth(self) -> None:
         self._test_rights("owner_write", "", "/", "r", 207, False)
         self._test_rights("owner_write", "", "/", "w", 401, False)
         self._test_rights("owner_write", "", "/tmp/", "r", 207, False)
         self._test_rights("owner_write", "", "/tmp/", "w", 207, False)

-    def test_authenticated(self):
+    def test_authenticated(self) -> None:
         self._test_rights("authenticated", "", "/", "r", 401)
         self._test_rights("authenticated", "", "/", "w", 401)
         self._test_rights("authenticated", "", "/tmp/", "r", 401)
@@ -109,13 +93,13 @@ class TestBaseRightsRequests(BaseTest):
         self._test_rights("authenticated", "tmp", "/other/", "r", 207)
         self._test_rights("authenticated", "tmp", "/other/", "w", 207)

-    def test_authenticated_without_auth(self):
+    def test_authenticated_without_auth(self) -> None:
         self._test_rights("authenticated", "", "/", "r", 207, False)
         self._test_rights("authenticated", "", "/", "w", 207, False)
         self._test_rights("authenticated", "", "/tmp/", "r", 207, False)
         self._test_rights("authenticated", "", "/tmp/", "w", 207, False)

-    def test_from_file(self):
+    def test_from_file(self) -> None:
         rights_file_path = os.path.join(self.colpath, "rights")
         with open(rights_file_path, "w") as f:
             f.write("""\
@@ -127,8 +111,7 @@ permissions: RrWw
 user: .*
 collection: custom(/.*)?
 permissions: Rr""")
-        self.configuration.update(
-            {"rights": {"file": rights_file_path}}, "test")
+        self.configure({"rights": {"file": rights_file_path}})
         self._test_rights("from_file", "", "/other/", "r", 401)
         self._test_rights("from_file", "tmp", "/other/", "r", 403)
         self._test_rights("from_file", "", "/custom/sub", "r", 404)
@@ -148,10 +131,8 @@ permissions: RrWw
 user: .*
 collection: public/[^/]*
 permissions: i""")
-        self.configuration.update(
-            {"rights": {"type": "from_file",
-                        "file": rights_file_path}}, "test")
-        self.application = Application(self.configuration)
+        self.configure({"rights": {"type": "from_file",
+                                   "file": rights_file_path}})
         self.mkcalendar("/tmp/calendar", login="tmp:bepo")
         self.mkcol("/public", login="tmp:bepo")
         self.mkcalendar("/public/calendar", login="tmp:bepo")
@@ -160,13 +141,13 @@ permissions: i""")
         self.get("/public/calendar")
         self.get("/public/calendar/1.ics", check=401)

-    def test_custom(self):
+    def test_custom(self) -> None:
         """Custom rights management."""
         self._test_rights("radicale.tests.custom.rights", "", "/", "r", 401)
         self._test_rights(
             "radicale.tests.custom.rights", "", "/tmp/", "r", 207)

-    def test_collections_and_items(self):
+    def test_collections_and_items(self) -> None:
         """Test rights for creation of collections, calendars and items.

         Collections are allowed at "/" and "/.../".
@@ -174,7 +155,6 @@ permissions: i""")
         Items are allowed at "/.../.../...".

         """
-        self.application = Application(self.configuration)
         self.mkcalendar("/", check=401)
         self.mkcalendar("/user/", check=401)
         self.mkcol("/user/")
@@ -183,9 +163,8 @@ permissions: i""")
         self.mkcol("/user/calendar/item", check=401)
         self.mkcalendar("/user/calendar/item", check=401)

-    def test_put_collections_and_items(self):
+    def test_put_collections_and_items(self) -> None:
         """Test rights for creation of calendars and items with PUT."""
-        self.application = Application(self.configuration)
         self.put("/user/", "BEGIN:VCALENDAR\r\nEND:VCALENDAR", check=401)
         self.mkcol("/user/")
         self.put("/user/calendar/", "BEGIN:VCALENDAR\r\nEND:VCALENDAR")
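The `from_file` tests above write rights files of INI sections with `user`, `collection`, and `permissions` keys, where `user` and `collection` are regular expressions matched against the request. The matching idea can be sketched with `configparser` and `re` (an illustrative simplification under assumed semantics — first fully matching section wins — not Radicale's `from_file` backend):

```python
import configparser
import re

RIGHTS = """\
[custom-read]
user: .*
collection: custom(/.*)?
permissions: Rr
"""


def permissions(rights_text, user, collection):
    """Return the permissions string of the first section whose user and
    collection patterns fully match, or "" if no section matches."""
    parser = configparser.ConfigParser()
    parser.read_string(rights_text)
    for section in parser.sections():
        if (re.fullmatch(parser.get(section, "user"), user)
                and re.fullmatch(parser.get(section, "collection"),
                                 collection)):
            return parser.get(section, "permissions")
    return ""
```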
|
@ -1,4 +1,4 @@
|
|||
# This file is part of Radicale Server - Calendar Server
|
||||
# This file is part of Radicale - CalDAV and CardDAV server
|
||||
# Copyright © 2018-2019 Unrud <unrud@outlook.com>
|
||||
#
|
||||
# This library is free software: you can redistribute it and/or modify
|
||||
|
@ -21,15 +21,15 @@ Test the internal server.
|
|||
|
||||
import errno
|
||||
import os
|
||||
import shutil
|
||||
import socket
|
||||
import ssl
|
||||
import subprocess
|
||||
import sys
|
||||
import tempfile
|
||||
import threading
|
||||
import time
|
||||
from configparser import RawConfigParser
|
||||
from http.client import HTTPMessage
|
||||
from typing import IO, Callable, Dict, Optional, Tuple, cast
|
||||
from urllib import request
|
||||
from urllib.error import HTTPError, URLError
|
||||
|
||||
|
@ -41,31 +41,29 @@ from radicale.tests.helpers import configuration_to_dict, get_file_path
|
|||
|
||||
|
||||
class DisabledRedirectHandler(request.HTTPRedirectHandler):
|
||||
def http_error_302(self, req, fp, code, msg, headers):
|
||||
raise HTTPError(req.full_url, code, msg, headers, fp)
|
||||
|
||||
http_error_301 = http_error_303 = http_error_307 = http_error_302
|
||||
def redirect_request(
|
||||
self, req: request.Request, fp: IO[bytes], code: int, msg: str,
|
||||
headers: HTTPMessage, newurl: str) -> None:
|
||||
return None
|
||||
|
||||
|
||||
class TestBaseServerRequests(BaseTest):
|
||||
"""Test the internal server."""
|
||||
|
||||
def setup(self):
|
||||
self.configuration = config.load()
|
||||
self.colpath = tempfile.mkdtemp()
|
||||
shutdown_socket: socket.socket
|
||||
thread: threading.Thread
|
||||
opener: request.OpenerDirector
|
||||
|
||||
def setup(self) -> None:
|
||||
super().setup()
|
||||
self.shutdown_socket, shutdown_socket_out = socket.socketpair()
|
||||
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
|
||||
# Find available port
|
||||
sock.bind(("127.0.0.1", 0))
|
||||
self.sockname = sock.getsockname()
|
||||
self.configuration.update({
|
||||
"storage": {"filesystem_folder": self.colpath,
|
||||
# Disable syncing to disk for better performance
|
||||
"_filesystem_fsync": "False"},
|
||||
"server": {"hosts": "[%s]:%d" % self.sockname},
|
||||
# Enable debugging for new processes
|
||||
"logging": {"level": "debug"}},
|
||||
"test", privileged=True)
|
||||
self.configure({"server": {"hosts": "[%s]:%d" % self.sockname},
|
||||
# Enable debugging for new processes
|
||||
"logging": {"level": "debug"}})
|
||||
self.thread = threading.Thread(target=server.serve, args=(
|
||||
self.configuration, shutdown_socket_out))
|
||||
ssl_context = ssl.create_default_context()
|
||||
|
@ -75,48 +73,67 @@ class TestBaseServerRequests(BaseTest):
|
|||
request.HTTPSHandler(context=ssl_context),
|
||||
DisabledRedirectHandler)
|
||||
|
||||
def teardown(self):
|
||||
def teardown(self) -> None:
|
||||
self.shutdown_socket.close()
|
||||
try:
|
||||
self.thread.join()
|
||||
except RuntimeError: # Thread never started
|
||||
pass
|
||||
shutil.rmtree(self.colpath)
|
||||
super().teardown()
|
||||
|
||||
def request(self, method, path, data=None, is_alive_fn=None, **headers):
|
||||
def request(self, method: str, path: str, data: Optional[str] = None,
|
||||
check: Optional[int] = None, **kwargs
|
||||
) -> Tuple[int, Dict[str, str], str]:
|
||||
"""Send a request."""
|
||||
login = kwargs.pop("login", None)
|
||||
if login is not None and not isinstance(login, str):
|
||||
raise TypeError("login argument must be %r, not %r" %
|
||||
(str, type(login)))
|
||||
if login:
|
||||
raise NotImplementedError
|
||||
is_alive_fn: Optional[Callable[[], bool]] = kwargs.pop(
|
||||
"is_alive_fn", None)
|
||||
headers: Dict[str, str] = kwargs
|
||||
for k, v in headers.items():
|
||||
if not isinstance(v, str):
|
||||
raise TypeError("type of %r is %r, expected %r" %
|
||||
(k, type(v), str))
|
||||
if is_alive_fn is None:
|
||||
is_alive_fn = self.thread.is_alive
|
||||
scheme = ("https" if self.configuration.get("server", "ssl") else
|
||||
"http")
|
||||
encoding: str = self.configuration.get("encoding", "request")
|
||||
scheme = "https" if self.configuration.get("server", "ssl") else "http"
|
||||
data_bytes = None
|
||||
if data:
|
||||
data_bytes = data.encode(encoding)
|
||||
req = request.Request(
|
||||
"%s://[%s]:%d%s" % (scheme, *self.sockname, path),
|
||||
data=data, headers=headers, method=method)
|
||||
data=data_bytes, headers=headers, method=method)
|
||||
while True:
|
||||
assert is_alive_fn()
|
||||
try:
|
||||
with self.opener.open(req) as f:
|
||||
return f.getcode(), f.info(), f.read().decode()
|
||||
return f.getcode(), dict(f.info()), f.read().decode()
|
||||
except HTTPError as e:
|
||||
return e.code, e.headers, e.read().decode()
|
||||
assert check is None or e.code == check, "%d != %d" % (e.code,
|
||||
check)
|
||||
return e.code, dict(e.headers), e.read().decode()
|
||||
except URLError as e:
|
||||
if not isinstance(e.reason, ConnectionRefusedError):
|
||||
raise
|
||||
time.sleep(0.1)
|
||||
|
||||
def test_root(self):
|
||||
def test_root(self) -> None:
|
||||
self.thread.start()
|
||||
self.get("/", check=302)
|
||||
|
||||
def test_ssl(self):
|
||||
self.configuration.update({
|
||||
"server": {"ssl": "True",
|
||||
"certificate": get_file_path("cert.pem"),
|
||||
"key": get_file_path("key.pem")}}, "test")
|
||||
def test_ssl(self) -> None:
|
||||
self.configure({"server": {"ssl": "True",
|
||||
"certificate": get_file_path("cert.pem"),
|
||||
"key": get_file_path("key.pem")}})
|
||||
self.thread.start()
|
||||
self.get("/", check=302)
|
||||
|
||||
def test_bind_fail(self):
|
||||
def test_bind_fail(self) -> None:
|
||||
for address_family, address in [(socket.AF_INET, "::1"),
|
||||
(socket.AF_INET6, "127.0.0.1")]:
|
||||
with socket.socket(address_family, socket.SOCK_STREAM) as sock:
|
||||
|
@ -132,10 +149,11 @@ class TestBaseServerRequests(BaseTest):
|
|||
socket.EAI_NONAME, server.COMPAT_EAI_ADDRFAMILY,
|
||||
server.COMPAT_EAI_NODATA) or
|
||||
str(exc_info.value) == "address family mismatched" or
|
||||
exc_info.value.errno == errno.EADDRNOTAVAIL or
|
||||
exc_info.value.errno == errno.EAFNOSUPPORT)
|
||||
exc_info.value.errno in (
|
||||
errno.EADDRNOTAVAIL, errno.EAFNOSUPPORT,
|
||||
errno.EPROTONOSUPPORT))
|
||||
|
||||
def test_ipv6(self):
|
||||
def test_ipv6(self) -> None:
|
||||
try:
|
||||
with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as sock:
|
||||
# Only allow IPv6 connections to the IPv6 socket
|
||||
|
@ -145,44 +163,54 @@ class TestBaseServerRequests(BaseTest):
|
|||
                 sock.bind(("::1", 0))
                 self.sockname = sock.getsockname()[:2]
         except OSError as e:
-            if e.errno in (errno.EADDRNOTAVAIL, errno.EAFNOSUPPORT):
+            if e.errno in (errno.EADDRNOTAVAIL, errno.EAFNOSUPPORT,
+                           errno.EPROTONOSUPPORT):
                 pytest.skip("IPv6 not supported")
             raise
-        self.configuration.update({
-            "server": {"hosts": "[%s]:%d" % self.sockname}}, "test")
+        self.configure({"server": {"hosts": "[%s]:%d" % self.sockname}})
         self.thread.start()
         self.get("/", check=302)

-    def test_command_line_interface(self):
+    def test_command_line_interface(self, with_bool_options=False) -> None:
+        self.configure({"headers": {"Test-Server": "test"}})
         config_args = []
-        for section, values in config.DEFAULT_CONFIG_SCHEMA.items():
-            for option, data in values.items():
+        for section in self.configuration.sections():
+            if section.startswith("_"):
+                continue
+            for option in self.configuration.options(section):
+                if option.startswith("_"):
+                    continue
                 long_name = "--%s-%s" % (section, option.replace("_", "-"))
-                if data["type"] == bool:
-                    if not self.configuration.get(section, option):
+                if with_bool_options and config.DEFAULT_CONFIG_SCHEMA.get(
+                        section, {}).get(option, {}).get("type") == bool:
+                    if not cast(bool, self.configuration.get(section, option)):
                         long_name = "--no%s" % long_name[1:]
                     config_args.append(long_name)
                 else:
                     config_args.append(long_name)
-                    config_args.append(
-                        self.configuration.get_raw(section, option))
-        env = os.environ.copy()
-        env["PYTHONPATH"] = os.pathsep.join(sys.path)
+                    raw_value = self.configuration.get_raw(section, option)
+                    assert isinstance(raw_value, str)
+                    config_args.append(raw_value)
+        config_args.append("--headers-Test-Header=test")
         p = subprocess.Popen(
-            [sys.executable, "-m", "radicale"] + config_args, env=env)
+            [sys.executable, "-m", "radicale"] + config_args,
+            env={**os.environ, "PYTHONPATH": os.pathsep.join(sys.path)})
         try:
-            self.get("/", is_alive_fn=lambda: p.poll() is None, check=302)
+            status, headers, _ = self.request(
+                "GET", "/", check=302, is_alive_fn=lambda: p.poll() is None)
+            for key in self.configuration.options("headers"):
+                assert headers.get(key) == self.configuration.get(
+                    "headers", key)
         finally:
             p.terminate()
             p.wait()
-        if os.name == "posix":
+        if sys.platform != "win32":
             assert p.returncode == 0

-    def test_wsgi_server(self):
+    def test_command_line_interface_with_bool_options(self) -> None:
+        self.test_command_line_interface(with_bool_options=True)
+
+    def test_wsgi_server(self) -> None:
         config_path = os.path.join(self.colpath, "config")
         parser = RawConfigParser()
         parser.read_dict(configuration_to_dict(self.configuration))
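The flag-building loop in `test_command_line_interface` flattens the ini-style configuration into pairs of `--section-option` flags and values. A standalone sketch of that pattern (the config values here are illustrative, not Radicale defaults):

```python
# Sketch: turn each (section, option) of an ini-style config into a
# "--section-option value" command-line pair, as the test above does.
from configparser import RawConfigParser

parser = RawConfigParser()
parser.read_string("[server]\nhosts = localhost:5232\n[web]\ntype = internal\n")

config_args = []
for section in parser.sections():
    for option in parser.options(section):
        config_args.append("--%s-%s" % (section, option.replace("_", "-")))
        config_args.append(parser.get(section, option))

print(config_args)
# ['--server-hosts', 'localhost:5232', '--web-type', 'internal']
```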
@@ -191,9 +219,10 @@ class TestBaseServerRequests(BaseTest):
         env = os.environ.copy()
         env["PYTHONPATH"] = os.pathsep.join(sys.path)
         env["RADICALE_CONFIG"] = config_path
+        raw_server_hosts = self.configuration.get_raw("server", "hosts")
+        assert isinstance(raw_server_hosts, str)
         p = subprocess.Popen([
-            sys.executable, "-m", "waitress",
-            "--listen", self.configuration.get_raw("server", "hosts"),
+            sys.executable, "-m", "waitress", "--listen", raw_server_hosts,
             "radicale:application"], env=env)
         try:
             self.get("/", is_alive_fn=lambda: p.poll() is None, check=302)

189 radicale/tests/test_storage.py (new file)

@@ -0,0 +1,189 @@
# This file is part of Radicale - CalDAV and CardDAV server
# Copyright © 2012-2017 Guillaume Ayoub
# Copyright © 2017-2019 Unrud <unrud@outlook.com>
#
# This library is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Radicale.  If not, see <http://www.gnu.org/licenses/>.

"""
Tests for storage backends.

"""

import os
import shutil
from typing import ClassVar, cast

import pytest

import radicale.tests.custom.storage_simple_sync
from radicale.tests import BaseTest
from radicale.tests.helpers import get_file_content
from radicale.tests.test_base import TestBaseRequests as _TestBaseRequests


class TestMultiFileSystem(BaseTest):
    """Tests for multifilesystem."""

    def setup(self) -> None:
        _TestBaseRequests.setup(cast(_TestBaseRequests, self))
        self.configure({"storage": {"type": "multifilesystem"}})

    def test_folder_creation(self) -> None:
        """Verify that the folder is created."""
        folder = os.path.join(self.colpath, "subfolder")
        self.configure({"storage": {"filesystem_folder": folder}})
        assert os.path.isdir(folder)

    def test_fsync(self) -> None:
        """Create a directory and file with syncing enabled."""
        self.configure({"storage": {"_filesystem_fsync": "True"}})
        self.mkcalendar("/calendar.ics/")

    def test_hook(self) -> None:
        """Run hook."""
        self.configure({"storage": {"hook": "mkdir %s" % os.path.join(
            "collection-root", "created_by_hook")}})
        self.mkcalendar("/calendar.ics/")
        self.propfind("/created_by_hook/")

    def test_hook_read_access(self) -> None:
        """Verify that hook is not run for read accesses."""
        self.configure({"storage": {"hook": "mkdir %s" % os.path.join(
            "collection-root", "created_by_hook")}})
        self.propfind("/")
        self.propfind("/created_by_hook/", check=404)

    @pytest.mark.skipif(not shutil.which("flock"),
                        reason="flock command not found")
    def test_hook_storage_locked(self) -> None:
        """Verify that the storage is locked when the hook runs."""
        self.configure({"storage": {"hook": (
            "flock -n .Radicale.lock || exit 0; exit 1")}})
        self.mkcalendar("/calendar.ics/")

    def test_hook_principal_collection_creation(self) -> None:
        """Verify that the hook runs when a new user is created."""
        self.configure({"storage": {"hook": "mkdir %s" % os.path.join(
            "collection-root", "created_by_hook")}})
        self.propfind("/", login="user:")
        self.propfind("/created_by_hook/")

    def test_hook_fail(self) -> None:
        """Verify that a request fails if the hook fails."""
        self.configure({"storage": {"hook": "exit 1"}})
        self.mkcalendar("/calendar.ics/", check=500)

    def test_item_cache_rebuild(self) -> None:
        """Delete the item cache and verify that it is rebuilt."""
        self.mkcalendar("/calendar.ics/")
        event = get_file_content("event1.ics")
        path = "/calendar.ics/event1.ics"
        self.put(path, event)
        _, answer1 = self.get(path)
        cache_folder = os.path.join(self.colpath, "collection-root",
                                    "calendar.ics", ".Radicale.cache", "item")
        assert os.path.exists(os.path.join(cache_folder, "event1.ics"))
        shutil.rmtree(cache_folder)
        _, answer2 = self.get(path)
        assert answer1 == answer2
        assert os.path.exists(os.path.join(cache_folder, "event1.ics"))

    def test_put_whole_calendar_uids_used_as_file_names(self) -> None:
        """Test if UIDs are used as file names."""
        _TestBaseRequests.test_put_whole_calendar(
            cast(_TestBaseRequests, self))
        for uid in ("todo", "event"):
            _, answer = self.get("/calendar.ics/%s.ics" % uid)
            assert "\r\nUID:%s\r\n" % uid in answer

    def test_put_whole_calendar_random_uids_used_as_file_names(self) -> None:
        """Test if random UIDs are used as file names."""
        _TestBaseRequests.test_put_whole_calendar_without_uids(
            cast(_TestBaseRequests, self))
        _, answer = self.get("/calendar.ics")
        assert answer is not None
        uids = []
        for line in answer.split("\r\n"):
            if line.startswith("UID:"):
                uids.append(line[len("UID:"):])
        for uid in uids:
            _, answer = self.get("/calendar.ics/%s.ics" % uid)
            assert answer is not None
            assert "\r\nUID:%s\r\n" % uid in answer

    def test_put_whole_addressbook_uids_used_as_file_names(self) -> None:
        """Test if UIDs are used as file names."""
        _TestBaseRequests.test_put_whole_addressbook(
            cast(_TestBaseRequests, self))
        for uid in ("contact1", "contact2"):
            _, answer = self.get("/contacts.vcf/%s.vcf" % uid)
            assert "\r\nUID:%s\r\n" % uid in answer

    def test_put_whole_addressbook_random_uids_used_as_file_names(
            self) -> None:
        """Test if random UIDs are used as file names."""
        _TestBaseRequests.test_put_whole_addressbook_without_uids(
            cast(_TestBaseRequests, self))
        _, answer = self.get("/contacts.vcf")
        assert answer is not None
        uids = []
        for line in answer.split("\r\n"):
            if line.startswith("UID:"):
                uids.append(line[len("UID:"):])
        for uid in uids:
            _, answer = self.get("/contacts.vcf/%s.vcf" % uid)
            assert answer is not None
            assert "\r\nUID:%s\r\n" % uid in answer


class TestMultiFileSystemNoLock(BaseTest):
    """Tests for multifilesystem_nolock."""

    def setup(self) -> None:
        _TestBaseRequests.setup(cast(_TestBaseRequests, self))
        self.configure({"storage": {"type": "multifilesystem_nolock"}})

    test_add_event = _TestBaseRequests.test_add_event
    test_item_cache_rebuild = TestMultiFileSystem.test_item_cache_rebuild


class TestCustomStorageSystem(BaseTest):
    """Test custom backend loading."""

    def setup(self) -> None:
        _TestBaseRequests.setup(cast(_TestBaseRequests, self))
        self.configure({"storage": {
            "type": "radicale.tests.custom.storage_simple_sync"}})

    full_sync_token_support: ClassVar[bool] = False

    test_add_event = _TestBaseRequests.test_add_event
    _report_sync_token = _TestBaseRequests._report_sync_token
    # include tests related to sync token
    s: str = ""
    for s in dir(_TestBaseRequests):
        if s.startswith("test_") and "sync" in s.split("_"):
            locals()[s] = getattr(_TestBaseRequests, s)
    del s


class TestCustomStorageSystemCallable(BaseTest):
    """Test custom backend loading with ``callable``."""

    def setup(self) -> None:
        _TestBaseRequests.setup(cast(_TestBaseRequests, self))
        self.configure({"storage": {
            "type": radicale.tests.custom.storage_simple_sync.Storage}})

    test_add_event = _TestBaseRequests.test_add_event
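The `test_hook_storage_locked` test above relies on a second, non-blocking lock attempt on `.Radicale.lock` failing while the storage lock is held. A minimal Python sketch of that behaviour (POSIX only, using a throwaway temp file rather than Radicale's actual lock file):

```python
# Sketch: a non-blocking flock on a second open of the same file fails
# while the first exclusive lock is held (the property the hook checks).
import fcntl
import tempfile

with tempfile.NamedTemporaryFile() as f1, open(f1.name) as f2:
    fcntl.flock(f1, fcntl.LOCK_EX)  # hold an exclusive lock
    try:
        # f2 is a separate open file description, so this fails immediately.
        fcntl.flock(f2, fcntl.LOCK_EX | fcntl.LOCK_NB)
        locked_elsewhere = False
    except OSError:
        locked_elsewhere = True
    print(locked_elsewhere)  # True
```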

@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2018-2019 Unrud <unrud@outlook.com>
 #
 # This library is free software: you can redistribute it and/or modify
@@ -19,47 +19,31 @@ Test web plugin.

 """

-import shutil
-import tempfile
-
-from radicale import Application, config
 from radicale.tests import BaseTest


 class TestBaseWebRequests(BaseTest):
     """Test web plugin."""

-    def setup(self):
-        self.configuration = config.load()
-        self.colpath = tempfile.mkdtemp()
-        self.configuration.update({
-            "storage": {"filesystem_folder": self.colpath,
-                        # Disable syncing to disk for better performance
-                        "_filesystem_fsync": "False"}},
-            "test", privileged=True)
-        self.application = Application(self.configuration)
-
-    def teardown(self):
-        shutil.rmtree(self.colpath)
-
-    def test_internal(self):
-        status, headers, _ = self.request("GET", "/.web")
-        assert status == 302
-        assert headers.get("Location") == ".web/"
+    def test_internal(self) -> None:
+        _, headers, _ = self.request("GET", "/.web", check=302)
+        assert headers.get("Location") == "/.web/"
         _, answer = self.get("/.web/")
         assert answer
+        self.post("/.web", check=405)

-    def test_none(self):
-        self.configuration.update({"web": {"type": "none"}}, "test")
-        self.application = Application(self.configuration)
+    def test_none(self) -> None:
+        self.configure({"web": {"type": "none"}})
         _, answer = self.get("/.web")
         assert answer
-        self.get("/.web/", check=404)
+        _, headers, _ = self.request("GET", "/.web/", check=302)
+        assert headers.get("Location") == "/.web"
+        self.post("/.web", check=405)

-    def test_custom(self):
+    def test_custom(self) -> None:
         """Custom web plugin."""
-        self.configuration.update({
-            "web": {"type": "radicale.tests.custom.web"}}, "test")
-        self.application = Application(self.configuration)
+        self.configure({"web": {"type": "radicale.tests.custom.web"}})
         _, answer = self.get("/.web")
         assert answer == "custom"
+        _, answer = self.post("/.web", "body content")
+        assert answer == "echo:body content"

61 radicale/types.py (new file)

@@ -0,0 +1,61 @@
# This file is part of Radicale - CalDAV and CardDAV server
# Copyright © 2020 Unrud <unrud@outlook.com>
#
# This library is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Radicale.  If not, see <http://www.gnu.org/licenses/>.

import contextlib
import sys
from typing import (Any, Callable, ContextManager, Iterator, List, Mapping,
                    MutableMapping, Sequence, Tuple, TypeVar, Union)

WSGIResponseHeaders = Union[Mapping[str, str], Sequence[Tuple[str, str]]]
WSGIResponse = Tuple[int, WSGIResponseHeaders, Union[None, str, bytes]]
WSGIEnviron = Mapping[str, Any]
WSGIStartResponse = Callable[[str, List[Tuple[str, str]]], Any]

CONFIG = Mapping[str, Mapping[str, Any]]
MUTABLE_CONFIG = MutableMapping[str, MutableMapping[str, Any]]
CONFIG_SCHEMA = Mapping[str, Mapping[str, Any]]

_T = TypeVar("_T")


def contextmanager(func: Callable[..., Iterator[_T]]
                   ) -> Callable[..., ContextManager[_T]]:
    """Compatibility wrapper for `contextlib.contextmanager` with
    `typeguard`"""
    result = contextlib.contextmanager(func)
    result.__annotations__ = {**func.__annotations__,
                              "return": ContextManager[_T]}
    return result


if sys.version_info >= (3, 8):
    from typing import Protocol, runtime_checkable

    @runtime_checkable
    class InputStream(Protocol):
        def read(self, size: int = ...) -> bytes: ...

    @runtime_checkable
    class ErrorStream(Protocol):
        def flush(self) -> None: ...
        def write(self, s: str) -> None: ...
else:
    ErrorStream = Any
    InputStream = Any

from radicale import item, storage  # noqa:E402 isort:skip

CollectionOrItem = Union[item.Item, storage.BaseCollection]
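The `InputStream` and `ErrorStream` protocols above use structural typing: with `@runtime_checkable`, `isinstance` checks only that the required methods exist, so ordinary stream objects satisfy them without inheritance. A minimal sketch (reimplementing the protocol locally, so it runs standalone on Python 3.8+):

```python
# Sketch: a runtime-checkable protocol matches any object with the
# declared methods, without explicit subclassing.
import io
from typing import Protocol, runtime_checkable


@runtime_checkable
class InputStream(Protocol):
    def read(self, size: int = ...) -> bytes: ...


# io.BytesIO has a matching read() method, so the check passes.
print(isinstance(io.BytesIO(b"data"), InputStream))  # True
# A plain object has no read() method, so it fails.
print(isinstance(object(), InputStream))             # False
```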

@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2014 Jean-Marc Martins
 # Copyright © 2012-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
@@ -16,13 +16,25 @@
 # You should have received a copy of the GNU General Public License
 # along with Radicale.  If not, see <http://www.gnu.org/licenses/>.

+import sys
 from importlib import import_module
+from typing import Callable, Sequence, Type, TypeVar, Union

 from radicale import config
 from radicale.log import logger

+if sys.version_info < (3, 8):
+    import pkg_resources
+else:
+    from importlib import metadata

-def load_plugin(internal_types, module_name, class_name, configuration):
-    type_ = configuration.get(module_name, "type")
+_T_co = TypeVar("_T_co", covariant=True)
+
+
+def load_plugin(internal_types: Sequence[str], module_name: str,
+                class_name: str, base_class: Type[_T_co],
+                configuration: "config.Configuration") -> _T_co:
+    type_: Union[str, Callable] = configuration.get(module_name, "type")
     if callable(type_):
         logger.info("%s type is %r", module_name, type_)
         return type_(configuration)
@@ -37,3 +49,9 @@ def load_plugin(internal_types, module_name, class_name, configuration):
             (module_name, module, e)) from e
     logger.info("%s type is %r", module_name, module)
     return class_(configuration)
+
+
+def package_version(name):
+    if sys.version_info < (3, 8):
+        return pkg_resources.get_distribution(name).version
+    return metadata.version(name)
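At its core, `load_plugin` resolves a dotted module path with `importlib.import_module` and instantiates the named class from it. A minimal sketch of that pattern, using a stdlib module and class as stand-ins for a plugin module and its `Storage`/`Web` class:

```python
# Sketch of the dynamic-loading pattern behind load_plugin: resolve a
# module by dotted name, then fetch and instantiate the named class.
from importlib import import_module

module = import_module("json")           # stand-in for a plugin module name
class_ = getattr(module, "JSONDecoder")  # stand-in for the plugin class name
instance = class_()
print(type(instance).__name__)  # JSONDecoder
```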

@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
 # This library is free software: you can redistribute it and/or modify
@@ -21,18 +21,24 @@ Take a look at the class ``BaseWeb`` if you want to implement your own.

 """

-from radicale import utils
+from typing import Sequence

-INTERNAL_TYPES = ("none", "internal")
+from radicale import config, httputils, types, utils
+
+INTERNAL_TYPES: Sequence[str] = ("none", "internal")


-def load(configuration):
+def load(configuration: "config.Configuration") -> "BaseWeb":
     """Load the web module chosen in configuration."""
-    return utils.load_plugin(INTERNAL_TYPES, "web", "Web", configuration)
+    return utils.load_plugin(INTERNAL_TYPES, "web", "Web", BaseWeb,
+                             configuration)


 class BaseWeb:
-    def __init__(self, configuration):
+
+    configuration: "config.Configuration"
+
+    def __init__(self, configuration: "config.Configuration") -> None:
         """Initialize BaseWeb.

         ``configuration`` see ``radicale.config`` module.
@@ -42,7 +48,8 @@ class BaseWeb:
         """
         self.configuration = configuration

-    def get(self, environ, base_prefix, path, user):
+    def get(self, environ: types.WSGIEnviron, base_prefix: str, path: str,
+            user: str) -> types.WSGIResponse:
         """GET request.

         ``base_prefix`` is sanitized and never ends with "/".
@@ -52,4 +59,20 @@ class BaseWeb:
         ``user`` is empty for anonymous users.

         """
-        raise NotImplementedError
+        return httputils.METHOD_NOT_ALLOWED
+
+    def post(self, environ: types.WSGIEnviron, base_prefix: str, path: str,
+             user: str) -> types.WSGIResponse:
+        """POST request.
+
+        ``base_prefix`` is sanitized and never ends with "/".
+
+        ``path`` is sanitized and always starts with "/.web"
+
+        ``user`` is empty for anonymous users.
+
+        Use ``httputils.read*_request_body(self.configuration, environ)`` to
+        read the body.
+
+        """
+        return httputils.METHOD_NOT_ALLOWED
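A web plugin only needs to return a `(status, headers, body)` tuple from `get`/`post`. A hypothetical standalone sketch of such a handler (the class here does not subclass the real `radicale.web.BaseWeb`, and the greeting logic is illustrative):

```python
# Hypothetical sketch of a web plugin handler returning the
# (status, headers, body) shape described by types.WSGIResponse.
from http import client


class Web:  # in Radicale this would subclass radicale.web.BaseWeb
    def get(self, environ, base_prefix, path, user):
        # ``path`` always starts with "/.web"; answer with plain text.
        body = "hello %s" % (user or "anonymous")
        return client.OK, {"Content-Type": "text/plain"}, body


status, headers, body = Web().get({}, "", "/.web", "")
print(int(status), body)  # 200 hello anonymous
```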
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
 # This library is free software: you can redistribute it and/or modify
@@ -25,67 +25,15 @@ Features:

 """

-import os
-import posixpath
-import time
-from http import client
-
-import pkg_resources
-
-from radicale import httputils, pathutils, web
+from radicale import httputils, types, web
 from radicale.log import logger

-MIMETYPES = {
-    ".css": "text/css",
-    ".eot": "application/vnd.ms-fontobject",
-    ".gif": "image/gif",
-    ".html": "text/html",
-    ".js": "application/javascript",
-    ".manifest": "text/cache-manifest",
-    ".png": "image/png",
-    ".svg": "image/svg+xml",
-    ".ttf": "application/font-sfnt",
-    ".txt": "text/plain",
-    ".woff": "application/font-woff",
-    ".woff2": "font/woff2",
-    ".xml": "text/xml"}
-FALLBACK_MIMETYPE = "application/octet-stream"
+MIMETYPES = httputils.MIMETYPES  # deprecated
+FALLBACK_MIMETYPE = httputils.FALLBACK_MIMETYPE  # deprecated


 class Web(web.BaseWeb):
-    def __init__(self, configuration):
-        super().__init__(configuration)
-        self.folder = pkg_resources.resource_filename(__name__,
-                                                      "internal_data")
-
-    def get(self, environ, base_prefix, path, user):
-        assert path == "/.web" or path.startswith("/.web/")
-        assert pathutils.sanitize_path(path) == path
-        try:
-            filesystem_path = pathutils.path_to_filesystem(
-                self.folder, path[len("/.web"):].strip("/"))
-        except ValueError as e:
-            logger.debug("Web content with unsafe path %r requested: %s",
-                         path, e, exc_info=True)
-            return httputils.NOT_FOUND
-        if os.path.isdir(filesystem_path) and not path.endswith("/"):
-            location = posixpath.basename(path) + "/"
-            return (client.FOUND,
-                    {"Location": location, "Content-Type": "text/plain"},
-                    "Redirected to %s" % location)
-        if os.path.isdir(filesystem_path):
-            filesystem_path = os.path.join(filesystem_path, "index.html")
-        if not os.path.isfile(filesystem_path):
-            return httputils.NOT_FOUND
-        content_type = MIMETYPES.get(
-            os.path.splitext(filesystem_path)[1].lower(), FALLBACK_MIMETYPE)
-        with open(filesystem_path, "rb") as f:
-            answer = f.read()
-            last_modified = time.strftime(
-                "%a, %d %b %Y %H:%M:%S GMT",
-                time.gmtime(os.fstat(f.fileno()).st_mtime))
-        headers = {
-            "Content-Type": content_type,
-            "Last-Modified": last_modified}
-        return client.OK, headers, answer
+    def get(self, environ: types.WSGIEnviron, base_prefix: str, path: str,
+            user: str) -> types.WSGIResponse:
+        return httputils.serve_resource("radicale.web", "internal_data",
+                                        base_prefix, path)

@@ -21,15 +21,14 @@
  * @const
  * @type {string}
  */
-const SERVER = (location.protocol + '//' + location.hostname +
-                (location.port ? ':' + location.port : ''));
+const SERVER = location.origin;

 /**
  * Path of the root collection on the server (must end with /)
  * @const
  * @type {string}
  */
-const ROOT_PATH = location.pathname.replace(new RegExp("/+[^/]+/*(/index\\.html?)?$"), "") + '/';
+const ROOT_PATH = (new URL("..", location.href)).pathname;

 /**
  * Regex to match and normalize color
@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
 # This library is free software: you can redistribute it and/or modify
@@ -21,13 +21,15 @@ A dummy web backend that shows a simple message.

 from http import client

-from radicale import httputils, pathutils, web
+from radicale import httputils, pathutils, types, web


 class Web(web.BaseWeb):
-    def get(self, environ, base_prefix, path, user):
+
+    def get(self, environ: types.WSGIEnviron, base_prefix: str, path: str,
+            user: str) -> types.WSGIResponse:
         assert path == "/.web" or path.startswith("/.web/")
         assert pathutils.sanitize_path(path) == path
         if path != "/.web":
-            return httputils.NOT_FOUND
+            return httputils.redirect(base_prefix + "/.web")
         return client.OK, {"Content-Type": "text/plain"}, "Radicale works!"

@@ -1,4 +1,4 @@
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2008 Nicolas Kandel
 # Copyright © 2008 Pascal Halter
 # Copyright © 2008-2015 Guillaume Ayoub
@@ -26,20 +26,21 @@ import copy
 import xml.etree.ElementTree as ET
 from collections import OrderedDict
 from http import client
+from typing import Dict, Mapping, Optional
 from urllib.parse import quote

-from radicale import pathutils
+from radicale import item, pathutils

-MIMETYPES = {
+MIMETYPES: Mapping[str, str] = {
     "VADDRESSBOOK": "text/vcard",
     "VCALENDAR": "text/calendar"}

-OBJECT_MIMETYPES = {
+OBJECT_MIMETYPES: Mapping[str, str] = {
     "VCARD": "text/vcard",
     "VLIST": "text/x-vlist",
     "VCALENDAR": "text/calendar"}

-NAMESPACES = {
+NAMESPACES: Mapping[str, str] = {
     "C": "urn:ietf:params:xml:ns:caldav",
     "CR": "urn:ietf:params:xml:ns:carddav",
     "D": "DAV:",
@@ -48,15 +49,15 @@ NAMESPACES = {
     "ME": "http://me.com/_namespace/",
     "RADICALE": "http://radicale.org/ns/"}

-NAMESPACES_REV = {}
+NAMESPACES_REV: Mapping[str, str] = {v: k for k, v in NAMESPACES.items()}

 for short, url in NAMESPACES.items():
-    NAMESPACES_REV[url] = short
     ET.register_namespace("" if short == "D" else short, url)


-def pretty_xml(element):
+def pretty_xml(element: ET.Element) -> str:
     """Indent an ElementTree ``element`` and its children."""
-    def pretty_xml_recursive(element, level):
+    def pretty_xml_recursive(element: ET.Element, level: int) -> None:
         indent = "\n" + level * "  "
         if len(element) > 0:
             if not (element.text or "").strip():
@@ -74,7 +75,7 @@ def pretty_xml(element):
     return '<?xml version="1.0"?>\n%s' % ET.tostring(element, "unicode")


-def make_clark(human_tag):
+def make_clark(human_tag: str) -> str:
     """Get XML Clark notation from human tag ``human_tag``.

     If ``human_tag`` is already in XML Clark notation it is returned as-is.
@@ -88,13 +89,13 @@ def make_clark(human_tag):
     ns_prefix, tag = human_tag.split(":", maxsplit=1)
     if not ns_prefix or not tag:
         raise ValueError("Invalid XML tag: %r" % human_tag)
-    ns = NAMESPACES.get(ns_prefix)
+    ns = NAMESPACES.get(ns_prefix, "")
     if not ns:
         raise ValueError("Unknown XML namespace prefix: %r" % human_tag)
     return "{%s}%s" % (ns, tag)


-def make_human_tag(clark_tag):
+def make_human_tag(clark_tag: str) -> str:
     """Replace known namespaces in XML Clark notation ``clark_tag`` with
     prefix.

@@ -111,31 +112,31 @@ def make_human_tag(clark_tag):
     ns, tag = clark_tag[len("{"):].split("}", maxsplit=1)
     if not ns or not tag:
         raise ValueError("Invalid XML tag: %r" % clark_tag)
-    ns_prefix = NAMESPACES_REV.get(ns)
+    ns_prefix = NAMESPACES_REV.get(ns, "")
     if ns_prefix:
         return "%s:%s" % (ns_prefix, tag)
     return clark_tag


-def make_response(code):
+def make_response(code: int) -> str:
     """Return full W3C names from HTTP status codes."""
     return "HTTP/1.1 %i %s" % (code, client.responses[code])


-def make_href(base_prefix, href):
+def make_href(base_prefix: str, href: str) -> str:
     """Return prefixed href."""
     assert href == pathutils.sanitize_path(href)
     return quote("%s%s" % (base_prefix, href))


-def webdav_error(human_tag):
+def webdav_error(human_tag: str) -> ET.Element:
     """Generate XML error message."""
     root = ET.Element(make_clark("D:error"))
     root.append(ET.Element(make_clark(human_tag)))
     return root


-def get_content_type(item, encoding):
+def get_content_type(item: "item.Item", encoding: str) -> str:
     """Get the content-type of an item with charset and component parameters.
     """
     mimetype = OBJECT_MIMETYPES[item.name]
@@ -146,36 +147,47 @@ def get_content_type(item, encoding):
     return content_type


-def props_from_request(xml_request, actions=("set", "remove")):
-    """Return a list of properties as a dictionary."""
-    result = OrderedDict()
+def props_from_request(xml_request: Optional[ET.Element]
+                       ) -> Dict[str, Optional[str]]:
+    """Return a list of properties as a dictionary.
+
+    Properties that should be removed are set to `None`.
+
+    """
+    result: OrderedDict = OrderedDict()
     if xml_request is None:
         return result

-    for action in actions:
-        action_element = xml_request.find(make_clark("D:%s" % action))
-        if action_element is not None:
-            break
-    else:
-        action_element = xml_request
-
-    prop_element = action_element.find(make_clark("D:prop"))
-    if prop_element is not None:
-        for prop in prop_element:
-            if prop.tag == make_clark("D:resourcetype"):
+    # Requests can contain multiple <D:set> and <D:remove> elements.
+    # Each of these elements must contain exactly one <D:prop> element which
+    # can contain multiple properties.
+    # The order of the elements in the document must be respected.
+    props = []
+    for element in xml_request:
+        if element.tag in (make_clark("D:set"), make_clark("D:remove")):
+            for prop in element.findall("./%s/*" % make_clark("D:prop")):
+                props.append((element.tag == make_clark("D:set"), prop))
+    for is_set, prop in props:
+        key = make_human_tag(prop.tag)
+        value = None
+        if prop.tag == make_clark("D:resourcetype"):
+            key = "tag"
+            if is_set:
                 for resource_type in prop:
                     if resource_type.tag == make_clark("C:calendar"):
-                        result["tag"] = "VCALENDAR"
+                        value = "VCALENDAR"
                         break
                     if resource_type.tag == make_clark("CR:addressbook"):
-                        result["tag"] = "VADDRESSBOOK"
+                        value = "VADDRESSBOOK"
                         break
-            elif prop.tag == make_clark("C:supported-calendar-component-set"):
-                result[make_human_tag(prop.tag)] = ",".join(
-                    supported_comp.attrib["name"]
-                    for supported_comp in prop
+        elif prop.tag == make_clark("C:supported-calendar-component-set"):
+            if is_set:
+                value = ",".join(
+                    supported_comp.attrib["name"] for supported_comp in prop
                     if supported_comp.tag == make_clark("C:comp"))
-            else:
-                result[make_human_tag(prop.tag)] = prop.text
+        elif is_set:
+            value = prop.text or ""
+        result[key] = value
+        result.move_to_end(key)

     return result

2 rights

@@ -18,7 +18,7 @@
|||
#collection:
|
||||
#permissions: R
|
||||
|
||||
# Allow reading and writing principal collection (same as user name)
|
||||
# Allow reading and writing principal collection (same as username)
|
||||
#[principal]
|
||||
#user: .+
|
||||
#collection: {user}
|
||||
|
|

41 setup.cfg

@@ -1,17 +1,42 @@
-[aliases]
-test = pytest
-
 [bdist_wheel]
 python-tag = py3

 [tool:pytest]
-addopts = --flake8 --isort --cov --cov-report=term --cov-report=xml -r s
-norecursedirs = dist .cache .git build Radicale.egg-info .eggs venv
+addopts = --typeguard-packages=radicale
+
+[tox:tox]
+
+[testenv]
+extras = test
+deps =
+    flake8
+    isort
+    # mypy installation fails with pypy<3.9
+    mypy; implementation_name!='pypy' or python_version>='3.9'
+    types-setuptools
+    pytest-cov
+commands =
+    flake8 .
+    isort --check --diff .
+    # Run mypy if it's installed
+    python -c 'import importlib.util, subprocess, sys; \
+        importlib.util.find_spec("mypy") \
+        and sys.exit(subprocess.run(["mypy", "."]).returncode) \
+        or print("Skipped: mypy is not installed")'
+    pytest -r s --cov --cov-report=term --cov-report=xml .
|
||||
|
||||
[tool:isort]
|
||||
known_standard_library = _dummy_thread,_thread,abc,aifc,argparse,array,ast,asynchat,asyncio,asyncore,atexit,audioop,base64,bdb,binascii,binhex,bisect,builtins,bz2,cProfile,calendar,cgi,cgitb,chunk,cmath,cmd,code,codecs,codeop,collections,colorsys,compileall,concurrent,configparser,contextlib,contextvars,copy,copyreg,crypt,csv,ctypes,curses,dataclasses,datetime,dbm,decimal,difflib,dis,distutils,doctest,dummy_threading,email,encodings,ensurepip,enum,errno,faulthandler,fcntl,filecmp,fileinput,fnmatch,formatter,fpectl,fractions,ftplib,functools,gc,getopt,getpass,gettext,glob,grp,gzip,hashlib,heapq,hmac,html,http,imaplib,imghdr,imp,importlib,inspect,io,ipaddress,itertools,json,keyword,lib2to3,linecache,locale,logging,lzma,macpath,mailbox,mailcap,marshal,math,mimetypes,mmap,modulefinder,msilib,msvcrt,multiprocessing,netrc,nis,nntplib,ntpath,numbers,operator,optparse,os,ossaudiodev,parser,pathlib,pdb,pickle,pickletools,pipes,pkgutil,platform,plistlib,poplib,posix,posixpath,pprint,profile,pstats,pty,pwd,py_compile,pyclbr,pydoc,queue,quopri,random,re,readline,reprlib,resource,rlcompleter,runpy,sched,secrets,select,selectors,shelve,shlex,shutil,signal,site,smtpd,smtplib,sndhdr,socket,socketserver,spwd,sqlite3,sre,sre_compile,sre_constants,sre_parse,ssl,stat,statistics,string,stringprep,struct,subprocess,sunau,symbol,symtable,sys,sysconfig,syslog,tabnanny,tarfile,telnetlib,tempfile,termios,test,textwrap,threading,time,timeit,tkinter,token,tokenize,trace,traceback,tracemalloc,tty,turtle,turtledemo,types,typing,unicodedata,unittest,urllib,uu,uuid,venv,warnings,wave,weakref,webbrowser,winreg,winsound,wsgiref,xdrlib,xml,xmlrpc,zipapp,zipfile,zipimport,zlib
|
||||
known_third_party = defusedxml,passlib,pkg_resources,pytest,vobject
|
||||
|
||||
[flake8]
|
||||
# Only enable default tests (https://github.com/PyCQA/flake8/issues/790#issuecomment-812823398)
|
||||
select = E,F,W,C90,DOES-NOT-EXIST
|
||||
ignore = E121,E123,E126,E226,E24,E704,W503,W504,DOES-NOT-EXIST
|
||||
extend-exclude = build
|
||||
|
||||
[mypy]
|
||||
ignore_missing_imports = True
|
||||
show_error_codes = True
|
||||
exclude = (^|/)build($|/)
|
||||
|
||||
[coverage:run]
|
||||
branch = True
|
||||
source = radicale
|
||||
|
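The quoted `python -c` one-liner in the new `commands` list makes the mypy step optional: `importlib.util.find_spec` probes whether the package is importable without importing it, and the process exits with mypy's return code only when it is installed. The same pattern in plain form — `run_if_installed` is a hypothetical helper for illustration, not part of the project:

```python
import importlib.util
import subprocess
import sys


def run_if_installed(module, command):
    """Run `command` only when `module` is importable; otherwise skip.

    Mirrors the optional-mypy trick in the tox `commands` entry.
    """
    # find_spec() returns None for a missing top-level module
    # instead of raising, so this is a cheap availability probe.
    if importlib.util.find_spec(module) is None:
        print("Skipped: %s is not installed" % module)
        return 0
    return subprocess.run(command).returncode


# e.g. run_if_installed("mypy", ["mypy", "."])
assert run_if_installed("json", [sys.executable, "-c", "pass"]) == 0
assert run_if_installed("no_such_module_xyz", ["irrelevant"]) == 0
```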
**setup.py** (64 changed lines)

```diff
@@ -1,6 +1,4 @@
 #!/usr/bin/env python3
-
-# This file is part of Radicale Server - Calendar Server
+# This file is part of Radicale - CalDAV and CardDAV server
 # Copyright © 2009-2017 Guillaume Ayoub
 # Copyright © 2017-2018 Unrud <unrud@outlook.com>
 #
@@ -17,66 +15,45 @@
 # You should have received a copy of the GNU General Public License
 # along with Radicale. If not, see <http://www.gnu.org/licenses/>.
 
-"""
-Radicale CalDAV and CardDAV server
-==================================
-
-The Radicale Project is a CalDAV (calendar) and CardDAV (contact) server. It
-aims to be a light solution, easy to use, easy to install, easy to configure.
-As a consequence, it requires few software dependances and is pre-configured to
-work out-of-the-box.
-
-The Radicale Project runs on most of the UNIX-like platforms (Linux, BSD,
-MacOS X) and Windows. It is known to work with Evolution, Lightning, iPhone
-and Android clients. It is free and open-source software, released under GPL
-version 3.
-
-For further information, please visit the `Radicale Website
-<https://radicale.org/>`_.
-
-"""
-
-import sys
-
 from setuptools import find_packages, setup
 
-# When the version is updated, a new section in the NEWS.md file must be
+# When the version is updated, a new section in the CHANGELOG.md file must be
 # added too.
 VERSION = "master"
-WEB_FILES = ["web/internal_data/css/icon.png",
+
+with open("README.md", encoding="utf-8") as f:
+    long_description = f.read()
+
+web_files = ["web/internal_data/css/icon.png",
              "web/internal_data/css/main.css",
              "web/internal_data/fn.js",
              "web/internal_data/index.html"]
 
-needs_pytest = {"pytest", "test", "ptr"}.intersection(sys.argv)
-pytest_runner = ["pytest-runner"] if needs_pytest else []
-tests_require = ["pytest-runner", "pytest", "pytest-cov", "pytest-flake8",
-                 "pytest-isort", "waitress"]
+install_requires = ["defusedxml", "passlib", "vobject>=0.9.6",
+                    "python-dateutil>=2.7.3", "pika>=1.1.0",
+                    "setuptools; python_version<'3.9'"]
+bcrypt_requires = ["passlib[bcrypt]", "bcrypt"]
+# typeguard requires pytest<7
+test_requires = ["pytest<7", "typeguard", "waitress", *bcrypt_requires]
 
 setup(
     name="Radicale",
     version=VERSION,
     description="CalDAV and CardDAV Server",
-    long_description=__doc__,
+    long_description=long_description,
+    long_description_content_type="text/markdown",
     author="Guillaume Ayoub",
     author_email="guillaume.ayoub@kozea.fr",
     url="https://radicale.org/",
     download_url=("https://pypi.python.org/packages/source/R/Radicale/"
                   "Radicale-%s.tar.gz" % VERSION),
     license="GNU GPL v3",
     platforms="Any",
     packages=find_packages(
         exclude=["*.tests", "*.tests.*", "tests.*", "tests"]),
-    package_data={"radicale": WEB_FILES},
+    package_data={"radicale": [*web_files, "py.typed"]},
     entry_points={"console_scripts": ["radicale = radicale.__main__:run"]},
-    install_requires=["defusedxml", "passlib", "vobject>=0.9.6",
-                      "python-dateutil>=2.7.3", "pika>=1.1.0"],
-    setup_requires=pytest_runner,
-    tests_require=tests_require,
-    extras_require={"test": tests_require,
-                    "bcrypt": ["passlib[bcrypt]", "bcrypt"]},
+    install_requires=install_requires,
+    extras_require={"test": test_requires, "bcrypt": bcrypt_requires},
     keywords=["calendar", "addressbook", "CalDAV", "CardDAV"],
-    python_requires=">=3.5.2",
+    python_requires=">=3.6.0",
    classifiers=[
         "Development Status :: 5 - Production/Stable",
         "Environment :: Console",
@@ -86,8 +63,11 @@ setup(
         "License :: OSI Approved :: GNU General Public License (GPL)",
         "Operating System :: OS Independent",
         "Programming Language :: Python :: 3",
-        "Programming Language :: Python :: 3.5",
         "Programming Language :: Python :: 3.6",
         "Programming Language :: Python :: 3.7",
         "Programming Language :: Python :: 3.8",
+        "Programming Language :: Python :: 3.9",
+        "Programming Language :: Python :: 3.10",
+        "Programming Language :: Python :: Implementation :: CPython",
+        "Programming Language :: Python :: Implementation :: PyPy",
         "Topic :: Office/Business :: Groupware"])
```
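Both the new `install_requires` and the tox `deps` use PEP 508 environment markers (`setuptools; python_version<'3.9'`, `mypy; implementation_name!='pypy' or python_version>='3.9'`). A hand-evaluated sketch of the mypy marker — the helper name is hypothetical, and note that marker comparisons on `python_version` are version comparisons, so a naive string compare would misorder `'3.10'` and `'3.9'`:

```python
import sys


def mypy_marker_satisfied():
    # Evaluates the marker:
    #   implementation_name != 'pypy' or python_version >= '3.9'
    # Compare version components as a tuple, not as a string:
    # the string '3.10' sorts *before* '3.9' lexicographically.
    return (sys.implementation.name != "pypy"
            or sys.version_info[:2] >= (3, 9))
```

On CPython this is always true (the first clause short-circuits); on PyPy it depends on the interpreter's Python version, which is exactly why the comment in `setup.cfg` says mypy installation fails with pypy<3.9.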