Attempt to resolve merge conflict

rinpatch 2018-12-01 18:12:27 +03:00
commit fe2759bc9f
3285 changed files with 11853 additions and 5745 deletions

.gitignore vendored (5 lines changed)

@@ -6,6 +6,9 @@
/uploads
/test/uploads
/.elixir_ls
/test/fixtures/test_tmp.txt
/test/fixtures/image_tmp.jpg
/doc
# Prevent committing custom emojis
/priv/static/emoji/custom/*
@@ -28,4 +31,4 @@ erl_crash.dump
.env
# Editor config
/.vscode
/.vscode


@@ -1,4 +1,4 @@
image: elixir:1.6.4
image: elixir:1.7.2
services:
- postgres:9.6.2
@@ -9,6 +9,11 @@ variables:
POSTGRES_PASSWORD: postgres
DB_HOST: postgres
cache:
key: ${CI_COMMIT_REF_SLUG}
paths:
- deps
- _build
stages:
- lint
- test


@@ -1,106 +0,0 @@
# Configuring Pleroma
In the `config/` directory, you will find the following relevant files:
* `config.exs`: default base configuration
* `dev.exs`: default additional configuration for `MIX_ENV=dev`
* `prod.exs`: default additional configuration for `MIX_ENV=prod`
Do not modify files in the list above.
Instead, override the settings by editing the following files:
* `dev.secret.exs`: custom additional configuration for `MIX_ENV=dev`
* `prod.secret.exs`: custom additional configuration for `MIX_ENV=prod`
## Uploads configuration
This section configures where uploaded files are stored, and whether or not
EXIF data should be automatically removed from uploaded pictures.
config :pleroma, Pleroma.Upload,
uploads: "uploads",
strip_exif: false
* `uploads`: where to put the uploaded files, relative to pleroma's main directory.
* `strip_exif`: whether or not to remove EXIF data from uploaded pics automatically.
This requires ImageMagick to be installed on the system (`apt install imagemagick`).
## Block functionality
config :pleroma, :activitypub,
accept_blocks: true,
unfollow_blocked: true,
outgoing_blocks: true
config :pleroma, :user, deny_follow_blocked: true
* `accept_blocks`: whether to accept incoming block activities from
other instances
* `unfollow_blocked`: whether blocks result in people getting
unfollowed
* `outgoing_blocks`: whether to federate blocks to other instances
* `deny_follow_blocked`: whether to disallow following an account that
has blocked the user in question
## Message Rewrite Filters (MRFs)
Modify incoming and outgoing posts.
config :pleroma, :instance,
rewrite_policy: Pleroma.Web.ActivityPub.MRF.NoOpPolicy
`rewrite_policy` specifies which MRF policies to apply.
It can either be a single policy or a list of policies.
Currently, the MRF policies available by default are:
* `Pleroma.Web.ActivityPub.MRF.NoOpPolicy`
* `Pleroma.Web.ActivityPub.MRF.DropPolicy`
* `Pleroma.Web.ActivityPub.MRF.SimplePolicy`
* `Pleroma.Web.ActivityPub.MRF.RejectNonPublic`
Some policies, such as SimplePolicy and RejectNonPublic,
can be additionally configured in their respective sections.
### NoOpPolicy
Does not modify posts (this is the default `rewrite_policy`)
### DropPolicy
Drops all posts.
It generally does not make sense to use this in production.
### SimplePolicy
Restricts the visibility of posts from certain instances.
config :pleroma, :mrf_simple,
media_removal: [],
media_nsfw: [],
federated_timeline_removal: [],
reject: [],
accept: []
* `media_removal`: posts from these instances will have attachments
removed
* `media_nsfw`: posts from these instances will have attachments marked
as nsfw
* `federated_timeline_removal`: posts from these instances will be
marked as unlisted
* `reject`: posts from these instances will be dropped
* `accept`: if not empty, only posts from these instances will be accepted
### RejectNonPublic
Drops posts with non-public visibility settings.
config :pleroma, :mrf_rejectnonpublic,
allow_followersonly: false,
allow_direct: false
* `allow_followersonly`: whether to allow follower-only posts through
the filter
* `allow_direct`: whether to allow direct messages through the filter


@@ -2,11 +2,13 @@
## About Pleroma
Pleroma is an OStatus-compatible social networking server written in Elixir, compatible with GNU Social and Mastodon. It is high-performance and can run on small devices like a Raspberry Pi.
Pleroma is a microblogging server software that can federate (= exchange messages with) other servers that support the same federation standards (OStatus and ActivityPub). What that means is that you can host a server for yourself or your friends and stay in control of your online identity, but still exchange messages with people on larger servers. Pleroma will federate with all servers that implement either OStatus or ActivityPub, like Friendica, GNU Social, Hubzilla, Mastodon, Misskey, Peertube, and Pixelfed.
Pleroma is written in Elixir, is high-performance, and can run on small devices like a Raspberry Pi.
For clients it supports both the [GNU Social API with Qvitter extensions](https://twitter-api.readthedocs.io/en/latest/index.html) and the [Mastodon client API](https://github.com/tootsuite/documentation/blob/master/Using-the-API/API.md).
Mobile clients that are known to work well:
Client applications that are known to work well:
* Twidere
* Tusky
@@ -15,6 +17,7 @@ Mobile clients that are known to work well:
* Amaroq (iOS)
* Tootdon (Android + iOS)
* Tootle (iOS)
* Whalebird (Windows + Mac + Linux)
No release has been made yet, but several servers have been online for months already. If you want to run your own server, feel free to contact us at @lain@pleroma.soykaf.com or in our dev chat at #pleroma on freenode or via matrix at https://matrix.heldscal.la/#/room/#freenode_#pleroma:matrix.org.
@@ -36,7 +39,7 @@ While we don't provide docker files, other people have written very good ones. T
* Run `mix pleroma.gen.instance`. This will ask you questions about your instance and generate a configuration file in `config/generated_config.exs`. Check that and copy it to either `config/dev.secret.exs` or `config/prod.secret.exs`. It will also create a `config/setup_db.psql`, which you should run as the PostgreSQL superuser (i.e., `sudo -u postgres psql -f config/setup_db.psql`). It will create the database, user, and password you gave `mix pleroma.gen.instance` earlier, as well as set up the necessary extensions in the database. PostgreSQL superuser privileges are only needed for this step.
* For these next steps, the default will be to run pleroma using the dev configuration file, `config/dev.secret.exs`. To run them using the prod config file, prefix each command at the shell with `MIX_ENV=prod`. For example: `MIX_ENV=prod mix phx.server`. You can also simply run `export MIX_ENV=prod` which will set this variable for the rest of the shell session.
* For these next steps, the default will be to run pleroma using the dev configuration file, `config/dev.secret.exs`. To run them using the prod config file, prefix each command at the shell with `MIX_ENV=prod`. For example: `MIX_ENV=prod mix phx.server`. Documentation for the config can be found at [``config/config.md``](config/config.md)
* Run `mix ecto.migrate` to run the database migrations. You will have to do this again after certain updates.
@@ -44,8 +47,6 @@ While we don't provide docker files, other people have written very good ones. T
* The common and convenient way for adding HTTPS is by using Nginx as a reverse proxy. You can look at example Nginx configuration in `installation/pleroma.nginx`. If you need TLS/SSL certificates for HTTPS, you can get some for free from Let's Encrypt: <https://letsencrypt.org/>. The simplest way to obtain and install a certificate is to use [Certbot](https://certbot.eff.org). Depending on your specific setup, certbot may be able to get a certificate and configure your web server automatically.
* [Not tested with system reboot yet!] You'll also want to set up Pleroma to be run as a systemd service. Example .service file can be found in `installation/pleroma.service` you can put it in `/etc/systemd/system/`.
## Running
* By default, it listens on port 4000 (TCP), so you can access it on http://localhost:4000/ (if you are on the same machine). In case of an error it will restart automatically.
@@ -54,9 +55,15 @@ While we don't provide docker files, other people have written very good ones. T
Pleroma comes with two frontends. The first one, Pleroma FE, can be reached by normally visiting the site. The other one, based on the Mastodon project, can be found by visiting the /web path of your site.
### As systemd service (with provided .service file)
An example .service file can be found in `installation/pleroma.service`; you can put it in `/etc/systemd/system/`.
Start it with `service pleroma start`.
Logs can be watched by using `journalctl -fu pleroma.service`.
### As OpenRC service (with provided RC file)
Copy ``installation/init.d/pleroma`` to ``/etc/init.d/pleroma``.
You can add it to the services run by default with:
``rc-update add pleroma``
### Standalone/run by other means
Run `mix phx.server` in the repository's root; it will output logs to stdout/stderr.
@@ -69,22 +76,6 @@ Add the following to your `dev.secret.exs` or `prod.secret.exs` if you want to p
This is useful for running pleroma inside Tor or i2p.
## Admin Tasks
### Register a User
Run `mix register_user <name> <nickname> <email> <bio> <password>`. The `name` appears on statuses, while the nickname corresponds to the user, e.g. `@nickname@instance.tld`
### Password reset
Run `mix generate_password_reset username` to generate a password reset link that you can then send to the user.
### Moderators
You can make users moderators. They will then be able to delete any post.
Run `mix set_moderator username [true|false]` to make a user a moderator or not.
## Troubleshooting
### No incoming federation


@@ -10,21 +10,52 @@
config :pleroma, Pleroma.Repo, types: Pleroma.PostgresTypes
# Upload configuration
config :pleroma, Pleroma.Upload,
uploads: "uploads",
strip_exif: false
uploader: Pleroma.Uploaders.Local,
filters: [],
proxy_remote: false,
proxy_opts: []
config :pleroma, Pleroma.Uploaders.Local, uploads: "uploads"
config :pleroma, Pleroma.Uploaders.S3,
bucket: nil,
public_endpoint: "https://s3.amazonaws.com"
config :pleroma, Pleroma.Uploaders.MDII,
cgi: "https://mdii.sakura.ne.jp/mdii-post.cgi",
files: "https://mdii.sakura.ne.jp"
config :pleroma, :emoji, shortcode_globs: ["/emoji/custom/**/*.png"]
config :pleroma, :uri_schemes, additionnal_schemes: []
config :pleroma, :uri_schemes,
valid_schemes: [
"https",
"http",
"dat",
"dweb",
"gopher",
"ipfs",
"ipns",
"irc",
"ircs",
"magnet",
"mailto",
"mumble",
"ssb",
"xmpp"
]
# Configures the endpoint
config :pleroma, Pleroma.Web.Endpoint,
url: [host: "localhost"],
protocol: "https",
secret_key_base: "aK4Abxf29xU9TTDKre9coZPUgevcVCFQJe/5xP/7Lt4BEif6idBIbjupVbOrbKxl",
signing_salt: "CqaoopA2",
render_errors: [view: Pleroma.Web.ErrorView, accepts: ~w(json)],
pubsub: [name: Pleroma.PubSub, adapter: Phoenix.PubSub.PG2]
pubsub: [name: Pleroma.PubSub, adapter: Phoenix.PubSub.PG2],
secure_cookie_flag: true
# Configures Elixir's Logger
config :logger, :console,
@@ -42,49 +73,70 @@
config :pleroma, :ostatus, Pleroma.Web.OStatus
config :pleroma, :httpoison, Pleroma.HTTP
version =
with {version, 0} <- System.cmd("git", ["rev-parse", "HEAD"]) do
"Pleroma #{Mix.Project.config()[:version]} #{String.trim(version)}"
else
_ -> "Pleroma #{Mix.Project.config()[:version]} dev"
end
# Configures http settings, upstream proxy etc.
config :pleroma, :http, proxy_url: nil
config :pleroma, :instance,
version: version,
name: "Pleroma",
email: "example@example.com",
description: "A Pleroma instance, an alternative fediverse server",
limit: 5000,
upload_limit: 16_000_000,
avatar_upload_limit: 2_000_000,
background_upload_limit: 4_000_000,
banner_upload_limit: 4_000_000,
registrations_open: true,
federating: true,
allow_relay: true,
rewrite_policy: Pleroma.Web.ActivityPub.MRF.NoOpPolicy,
public: true,
quarantined_instances: []
quarantined_instances: [],
managed_config: true,
allowed_post_formats: [
"text/plain",
"text/html",
"text/markdown"
],
finmoji_enabled: true,
mrf_transparency: true
config :pleroma, :markup,
# XXX - unfortunately, inline images must be enabled by default right now, because
# of custom emoji. Issue #275 discusses defanging that somehow.
allow_inline_images: true,
allow_headings: false,
allow_tables: false,
allow_fonts: false,
scrub_policy: [
Pleroma.HTML.Transform.MediaProxy,
Pleroma.HTML.Scrubber.Default
]
config :pleroma, :fe,
theme: "pleroma-dark",
logo: "/static/logo.png",
logo_mask: true,
logo_margin: "0.1em",
background: "/static/aurora_borealis.jpg",
redirect_root_no_login: "/main/all",
redirect_root_login: "/main/friends",
show_instance_panel: true,
show_who_to_follow_panel: false,
who_to_follow_provider:
"https://vinayaka.distsn.org/cgi-bin/vinayaka-user-match-osa-api.cgi?{{host}}+{{user}}",
who_to_follow_link: "https://vinayaka.distsn.org/?{{host}}+{{user}}",
scope_options_enabled: false
scope_options_enabled: false,
formatting_options_enabled: false,
collapse_message_with_subject: false,
hide_post_stats: false,
hide_user_stats: false
config :pleroma, :activitypub,
accept_blocks: true,
unfollow_blocked: true,
outgoing_blocks: true
outgoing_blocks: true,
follow_handshake_timeout: 500
config :pleroma, :user, deny_follow_blocked: true
config :pleroma, :mrf_normalize_markup, scrub_policy: Pleroma.HTML.Scrubber.Default
config :pleroma, :mrf_rejectnonpublic,
allow_followersonly: false,
allow_direct: false
@@ -98,9 +150,11 @@
config :pleroma, :media_proxy,
enabled: false,
redirect_on_failure: true
# base_url: "https://cache.pleroma.social"
# base_url: "https://cache.pleroma.social",
proxy_opts: [
# inline_content_types: [] | false | true,
# http: [:insecure]
]
config :pleroma, :chat, enabled: true
@@ -118,8 +172,30 @@
third_party_engine:
"http://vinayaka.distsn.org/cgi-bin/vinayaka-user-match-suggestions-api.cgi?{{host}}+{{user}}",
timeout: 300_000,
limit: 23,
web: "https://vinayaka.distsn.org/?{{host}}+{{user}}"
config :pleroma, :http_security,
enabled: true,
sts: false,
sts_max_age: 31_536_000,
ct_max_age: 2_592_000,
referrer_policy: "same-origin"
config :cors_plug,
max_age: 86_400,
methods: ["POST", "PUT", "DELETE", "GET", "PATCH", "OPTIONS"],
expose: [
"Link",
"X-RateLimit-Reset",
"X-RateLimit-Limit",
"X-RateLimit-Remaining",
"X-Request-Id",
"Idempotency-Key"
],
credentials: true,
headers: ["Authorization", "Content-Type", "Idempotency-Key"]
# Import environment specific config. This must remain at the bottom
# of this file so it overrides the configuration defined above.
import_config "#{Mix.env()}.exs"

config/config.md Normal file (111 lines changed)

@@ -0,0 +1,111 @@
# Configuration
This file describes the configuration; it is recommended to edit the relevant *.secret.exs file instead of the others found in the ``config`` directory.
If you run Pleroma with ``MIX_ENV=prod`` the file is ``prod.secret.exs``, otherwise it is ``dev.secret.exs``.
## Pleroma.Upload
* `uploader`: Select which `Pleroma.Uploaders` to use
* `filters`: List of `Pleroma.Upload.Filter` to use.
* `base_url`: The base URL to access a user-uploaded file. Useful when you want to proxy the media files via another host.
* `proxy_remote`: If you're using a remote uploader, Pleroma will proxy media requests instead of redirecting to it.
* `proxy_opts`: Proxy options, see `Pleroma.ReverseProxy` documentation.
Note: `strip_exif` has been replaced by `Pleroma.Upload.Filter.Mogrify`.
## Pleroma.Uploaders.Local
* `uploads`: Which directory to store the user-uploads in, relative to Pleroma's working directory
## Pleroma.Upload.Filter.Mogrify
* `args`: List of actions for the `mogrify` command, like `"strip"` or `["strip", {"implode", "1"}]`.
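A minimal sketch, assuming the Local uploader, of replacing the old `strip_exif` behaviour with this filter (put it in the relevant `*.secret.exs`):
```
config :pleroma, Pleroma.Upload,
  uploader: Pleroma.Uploaders.Local,
  filters: [Pleroma.Upload.Filter.Mogrify]

# "strip" removes EXIF and other metadata via mogrify
config :pleroma, Pleroma.Upload.Filter.Mogrify,
  args: ["strip"]
```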
## :uri_schemes
* `valid_schemes`: List of URI schemes that are considered valid in a URL
## :instance
* `name`: The instance's name
* `email`: Email used to reach an Administrator/Moderator of the instance
* `description`: The instance's description, can be seen in nodeinfo and ``/api/v1/instance``
* `limit`: Posts character limit (CW/Subject included in the counter)
* `upload_limit`: File size limit of uploads (except for avatar, background, banner)
* `avatar_upload_limit`: File size limit of users' profile avatars
* `background_upload_limit`: File size limit of users' profile backgrounds
* `banner_upload_limit`: File size limit of users' profile banners
* `registrations_open`: Enable registrations for anyone; invitations can be used when false.
* `federating`
* `allow_relay`: Enable Pleroma's Relay, which makes it possible to follow a whole instance
* `rewrite_policy`: Message Rewrite Policy, either one or a list (see the sketch after this list). Here are the ones available by default:
* `Pleroma.Web.ActivityPub.MRF.NoOpPolicy`: Doesn't modify activities (default)
* `Pleroma.Web.ActivityPub.MRF.DropPolicy`: Drops all activities. It generally doesn't make sense to use this in production
* `Pleroma.Web.ActivityPub.MRF.SimplePolicy`: Restrict the visibility of activities from certain instances (See ``:mrf_simple`` section)
* `Pleroma.Web.ActivityPub.MRF.RejectNonPublic`: Drops posts with non-public visibility settings (See ``:mrf_rejectnonpublic`` section)
* `public`: Makes the client API accessible in authenticated mode only, except for user profiles. Useful for disabling the Local Timeline and The Whole Known Network.
* `quarantined_instances`: List of ActivityPub instances where private (DMs, followers-only) activities will not be sent.
* `managed_config`: Whether the config for pleroma-fe is configured in this config file or in ``static/config.json``
* `allowed_post_formats`: MIME-type list of formats allowed to be posted (transformed into HTML)
* `finmoji_enabled`: Whether to enable the finmojis in the custom emoji.
* `mrf_transparency`: Make the content of your Message Rewrite Facility settings public (via nodeinfo).
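For illustration, a sketch of `rewrite_policy` given as a list; the two policies listed above are applied in order:
```
config :pleroma, :instance,
  rewrite_policy: [
    Pleroma.Web.ActivityPub.MRF.SimplePolicy,
    Pleroma.Web.ActivityPub.MRF.RejectNonPublic
  ]
```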
## :fe
This section is used to configure Pleroma-FE, unless ``:managed_config`` in ``:instance`` is set to false.
* `theme`: Which theme to use; they are defined in ``styles.json``
* `logo`: URL of the logo, defaults to Pleroma's logo
* `logo_mask`: Whether to mask the logo
* `logo_margin`: What margin to use around the logo
* `background`: URL of the background, unless viewing a user profile with a background that is set
* `redirect_root_no_login`: relative URL which indicates where to redirect when a user isn't logged in.
* `redirect_root_login`: relative URL which indicates where to redirect when a user is logged in.
* `show_instance_panel`: Whether to show the instance's specific panel.
* `scope_options_enabled`: Enable setting a notice's visibility and subject/CW when posting
* `formatting_options_enabled`: Enable choosing a post format other than plain text (i.e. HTML, Markdown) when posting; relates to ``:instance, allowed_post_formats``
* `collapse_message_with_subject`: When a message has a subject (aka Content Warning), collapse it by default
* `hide_post_stats`: Hide notice statistics (repeats, favorites, …)
* `hide_user_stats`: Hide profile statistics (posts, posts per day, followers, followings, …)
## :mrf_simple
* `media_removal`: List of instances to remove media from
* `media_nsfw`: List of instances whose media will be marked as NSFW (sensitive)
* `federated_timeline_removal`: List of instances to remove from Federated (aka The Whole Known Network) Timeline
* `reject`: List of instances to reject any activities from
* `accept`: List of instances to accept any activities from
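A hypothetical combination of the options above (the instance names are placeholders):
```
config :pleroma, :mrf_simple,
  media_nsfw: ["nsfw.example.org"],
  federated_timeline_removal: ["spammy.example.net"],
  reject: ["blocked.example.com"]
```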
## :mrf_rejectnonpublic
* `allow_followersonly`: whether to allow followers-only posts
* `allow_direct`: whether to allow direct messages
## :media_proxy
* `enabled`: Enables proxying of remote media to the instance's proxy
* `base_url`: The base URL to access a user-uploaded file. Useful when you want to proxy the media files via another host/CDN fronts.
* `proxy_opts`: All options defined in `Pleroma.ReverseProxy` documentation, defaults to `[max_body_length: (25*1_048_576)]`.
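A sketch that enables the proxy behind a separate cache host (the URL is a placeholder; `max_body_length` is the documented default):
```
config :pleroma, :media_proxy,
  enabled: true,
  base_url: "https://cache.example.tld",
  proxy_opts: [max_body_length: 25 * 1_048_576]
```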
## :gopher
* `enabled`: Enables the gopher interface
* `ip`: IP address to bind to
* `port`: Port to bind to
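For example, enabling the interface with the defaults used by `Pleroma.Gopher.Server`:
```
config :pleroma, :gopher,
  enabled: true,
  ip: {0, 0, 0, 0},
  port: 1234
```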
## :activitypub
* ``accept_blocks``: Whether to accept incoming block activities from other instances
* ``unfollow_blocked``: Whether blocks result in people getting unfollowed
* ``outgoing_blocks``: Whether to federate blocks to other instances
* ``deny_follow_blocked``: Whether to disallow following an account that has blocked the user in question
## :http_security
* ``enabled``: Whether the managed content security policy is enabled
* ``sts``: Whether to additionally send a `Strict-Transport-Security` header
* ``sts_max_age``: The maximum age for the `Strict-Transport-Security` header if sent
* ``ct_max_age``: The maximum age for the `Expect-CT` header if sent
* ``referrer_policy``: The referrer policy to use, either `"same-origin"` or `"no-referrer"`.
## :mrf_user_allowlist
The keys in this section are the domain names that the policy should apply to.
Each key should be assigned a list of users that should be allowed through by
their ActivityPub ID.
An example:
```
config :pleroma, :mrf_user_allowlist,
"example.org": ["https://example.org/users/admin"]
```


@@ -9,7 +9,7 @@
# Print only warnings and errors during test
config :logger, level: :warn
config :pleroma, Pleroma.Upload, uploads: "test/uploads"
config :pleroma, Pleroma.Uploaders.Local, uploads: "test/uploads"
# Configure your database
config :pleroma, Pleroma.Repo,


@@ -1,18 +1,31 @@
social.domain.tld {
tls user@domain.tld
# default Caddyfile config for Pleroma
#
# Simple installation instructions:
# 1. Replace 'example.tld' with your instance's domain wherever it appears.
# 2. Copy this section into your Caddyfile and restart Caddy.
log /var/log/caddy/pleroma.log
example.tld {
log /var/log/caddy/pleroma_access.log
errors /var/log/caddy/pleroma_error.log
cors / {
origin https://halcyon.domain.tld
origin https://pinafore.domain.tld
methods POST,PUT,DELETE,GET,PATCH,OPTIONS
allowed_headers Authorization,Content-Type,Idempotency-Key
exposed_headers Link,X-RateLimit-Reset,X-RateLimit-Limit,X-RateLimit-Remaining,X-Request-Id
}
gzip
proxy / localhost:4000 {
websocket
transparent
}
tls {
# Remove the rest of the lines in here, if you want to support older devices
key_type p256
ciphers ECDHE-ECDSA-WITH-CHACHA20-POLY1305 ECDHE-RSA-WITH-CHACHA20-POLY1305 ECDHE-ECDSA-AES256-GCM-SHA384 ECDHE-RSA-AES256-GCM-SHA384 ECDHE-ECDSA-AES128-GCM-SHA256 ECDHE-RSA-AES128-GCM-SHA256
}
# If you do not want to use the mediaproxy function, remove these lines.
# To use this directive, you need the http.cache plugin for Caddy.
cache {
match_path /proxy
default_max_age 720m
}
# Stop removing lines here.
}

installation/init.d/pleroma Executable file (21 lines changed)

@@ -0,0 +1,21 @@
#!/sbin/openrc-run
# Requires OpenRC >= 0.35
directory=~pleroma/pleroma
command=/usr/bin/mix
command_args="phx.server"
command_user=pleroma:pleroma
command_background=1
export PORT=4000
export MIX_ENV=prod
# Ask process to terminate within 30 seconds, otherwise kill it
retry="SIGTERM/30 SIGKILL/5"
pidfile="/var/run/pleroma.pid"
depend() {
need nginx postgresql
}


@@ -1,26 +1,54 @@
# default Apache site config for Pleroma
#
# needed modules: define headers proxy proxy_http proxy_wstunnel rewrite ssl
#
# Simple installation instructions:
# 1. Install your TLS certificate, possibly using Let's Encrypt.
# 2. Replace 'example.tld' with your instance's domain wherever it appears.
# 3. This assumes a Debian style Apache config. Copy this file to
# /etc/apache2/sites-available/ and then add a symlink to it in
# /etc/apache2/sites-enabled/ by running 'a2ensite pleroma-apache.conf', then restart Apache.
Define servername example.tld
ServerName ${servername}
ServerTokens Prod
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
<VirtualHost *:80>
#Example configuration for when Apache httpd and Pleroma are on the same host.
#Needed modules: proxy proxy_http proxy_wstunnel rewrite
#This assumes a Debian style Apache config. Put this in /etc/apache2/sites-available
#Doesn't include SSL, just run certbot and let it take care of that.
#Change this:
ServerName pleroma.example.com
Redirect permanent / https://${servername}
</VirtualHost>
<VirtualHost *:443>
SSLEngine on
SSLCertificateFile /etc/letsencrypt/live/${servername}/cert.pem
SSLCertificateKeyFile /etc/letsencrypt/live/${servername}/privkey.pem
SSLCertificateChainFile /etc/letsencrypt/live/${servername}/fullchain.pem
# Mozilla modern configuration, tweak to your needs
SSLProtocol all -SSLv3 -TLSv1 -TLSv1.1
SSLCipherSuite ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256
SSLHonorCipherOrder on
SSLCompression off
SSLSessionTickets off
RewriteEngine On
RewriteCond %{HTTP:Connection} Upgrade [NC]
RewriteCond %{HTTP:Upgrade} websocket [NC]
RewriteRule /(.*) ws://localhost:4000/$1 [P,L]
ProxyRequests off
ProxyPass / http://localhost:4000/
ProxyPassReverse / http://localhost:4000/
#Change this too:
RequestHeader set Host "pleroma.example.com"
RequestHeader set Host ${servername}
ProxyPreserveHost On
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>
# OCSP Stapling, only in httpd 2.3.3 and later
SSLUseStapling on
SSLStaplingResponderTimeout 5
SSLStaplingReturnResponderErrors off
SSLStaplingCache shmcb:/var/run/ocsp(128000)


@@ -10,8 +10,8 @@ proxy_cache_path /tmp/pleroma-media-cache levels=1:2 keys_zone=pleroma_media_cac
inactive=720m use_temp_path=off;
server {
listen 80;
server_name example.tld;
listen 80;
return 301 https://$server_name$request_uri;
# Uncomment this if you need to use the 'webroot' method with certbot. Make sure
@@ -46,7 +46,7 @@ server {
ssl_ecdh_curve X25519:prime256v1:secp384r1:secp521r1;
ssl_stapling on;
ssl_stapling_verify on;
server_name example.tld;
gzip_vary on;
@@ -60,27 +60,6 @@ server {
client_max_body_size 16m;
location / {
# if you do not want remote frontends to be able to access your Pleroma backend
# server, remove these lines.
add_header 'Access-Control-Allow-Origin' '*' always;
add_header 'Access-Control-Allow-Methods' 'POST, PUT, DELETE, GET, PATCH, OPTIONS' always;
add_header 'Access-Control-Allow-Headers' 'Authorization, Content-Type, Idempotency-Key' always;
add_header 'Access-Control-Expose-Headers' 'Link, X-RateLimit-Reset, X-RateLimit-Limit, X-RateLimit-Remaining, X-Request-Id' always;
if ($request_method = OPTIONS) {
return 204;
}
# stop removing lines here.
add_header X-XSS-Protection "1; mode=block";
add_header X-Permitted-Cross-Domain-Policies none;
add_header X-Frame-Options DENY;
add_header X-Content-Type-Options nosniff;
add_header Referrer-Policy same-origin;
add_header X-Download-Options noopen;
# Uncomment this only after you get HTTPS working.
# add_header Strict-Transport-Security "max-age=31536000; includeSubDomains";
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
@@ -91,10 +70,12 @@ server {
client_max_body_size 16m;
}
location /proxy {
location ~ ^/(media|proxy) {
proxy_cache pleroma_media_cache;
proxy_cache_lock on;
proxy_ignore_client_abort on;
proxy_buffering off;
chunked_transfer_encoding on;
proxy_pass http://localhost:4000;
}
}


@@ -6,10 +6,21 @@ After=network.target postgresql.service
User=pleroma
WorkingDirectory=/home/pleroma/pleroma
Environment="HOME=/home/pleroma"
Environment="MIX_ENV=prod"
ExecStart=/usr/local/bin/mix phx.server
ExecReload=/bin/kill $MAINPID
KillMode=process
Restart=on-failure
; Some security directives.
; Use private /tmp and /var/tmp folders inside a new file system namespace, which are discarded after the process stops.
PrivateTmp=true
; Mount /usr, /boot, and /etc as read-only for processes invoked by this service.
ProtectSystem=full
; Sets up a new /dev mount for the process and only adds API pseudo devices like /dev/null, /dev/zero or /dev/random but not physical devices. Disabled by default because it may not work on devices like the Raspberry Pi.
PrivateDevices=false
; Ensures that the service process and all its children can never gain new privileges through execve().
NoNewPrivileges=true
[Install]
WantedBy=multi-user.target


@@ -0,0 +1,97 @@
defmodule Mix.Tasks.MigrateLocalUploads do
use Mix.Task
import Mix.Ecto
alias Pleroma.{Upload, Uploaders.Local, Uploaders.S3}
require Logger
@log_every 50
@shortdoc "Migrate uploads from local to remote storage"
def run([target_uploader | args]) do
delete? = Enum.member?(args, "--delete")
Application.ensure_all_started(:pleroma)
local_path = Pleroma.Config.get!([Local, :uploads])
uploader = Module.concat(Pleroma.Uploaders, target_uploader)
unless Code.ensure_loaded?(uploader) do
raise("The uploader #{inspect(uploader)} is not an existing/loaded module.")
end
target_enabled? = Pleroma.Config.get([Upload, :uploader]) == uploader
unless target_enabled? do
Pleroma.Config.put([Upload, :uploader], uploader)
end
Logger.info("Migrating files from local #{local_path} to #{to_string(uploader)}")
if delete? do
Logger.warn(
"Attention: uploaded files will be deleted, hope you have backups! (--delete ; cancel with ^C)"
)
:timer.sleep(:timer.seconds(5))
end
uploads =
File.ls!(local_path)
|> Enum.map(fn id ->
root_path = Path.join(local_path, id)
cond do
File.dir?(root_path) ->
files = for file <- File.ls!(root_path), do: {id, file, Path.join([root_path, file])}
case List.first(files) do
{id, file, path} ->
{%Pleroma.Upload{id: id, name: file, path: id <> "/" <> file, tempfile: path},
root_path}
_ ->
nil
end
File.exists?(root_path) ->
file = Path.basename(id)
[hash, ext] = String.split(id, ".")
{%Pleroma.Upload{id: hash, name: file, path: file, tempfile: root_path}, root_path}
true ->
nil
end
end)
|> Enum.filter(& &1)
total_count = length(uploads)
Logger.info("Found #{total_count} uploads")
uploads
|> Task.async_stream(
fn {upload, root_path} ->
case Upload.store(upload, uploader: uploader, filters: [], size_limit: nil) do
{:ok, _} ->
if delete?, do: File.rm_rf!(root_path)
Logger.debug("uploaded: #{inspect(upload.path)} #{inspect(upload)}")
:ok
error ->
Logger.error("failed to upload #{inspect(upload.path)}: #{inspect(error)}")
end
end,
timeout: 150_000
)
|> Stream.chunk_every(@log_every)
|> Enum.reduce(0, fn done, count ->
count = count + length(done)
Logger.info("Uploaded #{count}/#{total_count} files")
count
end)
Logger.info("Done!")
end
def run(_) do
Logger.error("Usage: migrate_local_uploads S3|Swift [--delete]")
end
end


@@ -28,3 +28,44 @@ config :pleroma, Pleroma.Repo,
database: "<%= dbname %>",
hostname: "<%= dbhost %>",
pool_size: 10
# Enable Strict-Transport-Security once SSL is working:
# config :pleroma, :http_security,
# sts: true
# Configure S3 support if desired.
# The public S3 endpoint is different depending on region and provider,
# consult your S3 provider's documentation for details on what to use.
#
# config :pleroma, Pleroma.Uploaders.S3,
# bucket: "some-bucket",
# public_endpoint: "https://s3.amazonaws.com"
#
# Configure S3 credentials:
# config :ex_aws, :s3,
# access_key_id: "xxxxxxxxxxxxx",
# secret_access_key: "yyyyyyyyyyyy",
# region: "us-east-1",
# scheme: "https://"
#
# For using third-party S3 clones like wasabi, also do:
# config :ex_aws, :s3,
# host: "s3.wasabisys.com"
# Configure Openstack Swift support if desired.
#
# Many openstack deployments are different, so config is left very open with
# no assumptions made on which provider you're using. This should allow very
# wide support without needing separate handlers for OVH, Rackspace, etc.
#
# config :pleroma, Pleroma.Uploaders.Swift,
# container: "some-container",
# username: "api-username-yyyy",
# password: "api-key-xxxx",
# tenant_id: "<openstack-project/tenant-id>",
# auth_url: "https://keystone-endpoint.provider.com",
# storage_url: "https://swift-endpoint.provider.com/v1/AUTH_<tenant>/<container>",
# object_url: "https://cdn-endpoint.provider.com/<container>"
#


@@ -0,0 +1,19 @@
defmodule Mix.Tasks.ReactivateUser do
use Mix.Task
alias Pleroma.User
@moduledoc """
Reactivate a user
Usage: ``mix reactivate_user <nickname>``
Example: ``mix reactivate_user lain``
"""
def run([nickname]) do
Mix.Task.run("app.start")
with user <- User.get_by_nickname(nickname) do
User.deactivate(user, false)
end
end
end


@@ -0,0 +1,24 @@
defmodule Mix.Tasks.RelayFollow do
use Mix.Task
require Logger
alias Pleroma.Web.ActivityPub.Relay
@shortdoc "Follows a remote relay"
@moduledoc """
Follows a remote relay
Usage: ``mix relay_follow <relay_url>``
Example: ``mix relay_follow https://example.org/relay``
"""
def run([target]) do
Mix.Task.run("app.start")
with {:ok, activity} <- Relay.follow(target) do
# put this task to sleep to allow the genserver to push out the messages
:timer.sleep(500)
else
{:error, e} -> Mix.shell().error("Error while following #{target}: #{inspect(e)}")
end
end
end


@@ -0,0 +1,23 @@
defmodule Mix.Tasks.RelayUnfollow do
use Mix.Task
require Logger
alias Pleroma.Web.ActivityPub.Relay
@moduledoc """
Unfollows a remote relay
Usage: ``mix relay_unfollow <relay_url>``
Example: ``mix relay_unfollow https://example.org/relay``
"""
def run([target]) do
Mix.Task.run("app.start")
with {:ok, activity} <- Relay.unfollow(target) do
# put this task to sleep to allow the genserver to push out the messages
:timer.sleep(500)
else
{:error, e} -> Mix.shell().error("Error while unfollowing #{target}: #{inspect(e)}")
end
end
end


@@ -0,0 +1,32 @@
defmodule Mix.Tasks.SetAdmin do
use Mix.Task
alias Pleroma.User
@doc """
Sets admin status
Usage: set_admin nickname [true|false]
"""
def run([nickname | rest]) do
Application.ensure_all_started(:pleroma)
status =
case rest do
[status] -> status == "true"
_ -> true
end
with %User{local: true} = user <- User.get_by_nickname(nickname) do
info =
user.info
|> Map.put("is_admin", !!status)
cng = User.info_changeset(user, %{info: info})
{:ok, user} = User.update_and_set_cache(cng)
IO.puts("Admin status of #{nickname}: #{user.info["is_admin"]}")
else
_ ->
IO.puts("No local user #{nickname}")
end
end
end


@@ -0,0 +1,38 @@
defmodule Mix.Tasks.UnsubscribeUser do
use Mix.Task
alias Pleroma.{User, Repo}
require Logger
@moduledoc """
Deactivate and Unsubscribe local users from a user
Usage: ``mix unsubscribe_user <nickname>``
Example: ``mix unsubscribe_user lain``
"""
def run([nickname]) do
Mix.Task.run("app.start")
with %User{} = user <- User.get_by_nickname(nickname) do
Logger.info("Deactivating #{user.nickname}")
User.deactivate(user)
{:ok, friends} = User.get_friends(user)
Enum.each(friends, fn friend ->
user = Repo.get(User, user.id)
Logger.info("Unsubscribing #{friend.nickname} from #{user.nickname}")
User.unfollow(user, friend)
end)
:timer.sleep(500)
user = Repo.get(User, user.id)
if length(user.following) == 0 do
Logger.info("Successfully unsubscribed all followers from #{user.nickname}")
end
end
end
end


@@ -82,4 +82,10 @@ def get_create_activity_by_object_ap_id(_), do: nil
def normalize(obj) when is_map(obj), do: Activity.get_by_ap_id(obj["id"])
def normalize(ap_id) when is_binary(ap_id), do: Activity.get_by_ap_id(ap_id)
def normalize(_), do: nil
def get_in_reply_to_activity(%Activity{data: %{"object" => %{"inReplyTo" => ap_id}}}) do
get_create_activity_by_object_ap_id(ap_id)
end
def get_in_reply_to_activity(_), do: nil
end


@@ -1,10 +1,22 @@
defmodule Pleroma.Application do
use Application
import Supervisor.Spec
@name "Pleroma"
@version Mix.Project.config()[:version]
def name, do: @name
def version, do: @version
def named_version(), do: @name <> " " <> @version
def user_agent() do
info = "#{Pleroma.Web.base_url()} <#{Pleroma.Config.get([:instance, :email], "")}>"
named_version() <> "; " <> info
end
# See http://elixir-lang.org/docs/stable/elixir/Application.html
# for more information on OTP Applications
@env Mix.env()
def start(_type, _args) do
import Supervisor.Spec
import Cachex.Spec
# Define workers and child supervisors to be supervised
@@ -12,18 +24,31 @@ def start(_type, _args) do
[
# Start the Ecto repository
supervisor(Pleroma.Repo, []),
# Start the endpoint when the application starts
supervisor(Pleroma.Web.Endpoint, []),
# Start your own worker by calling: Pleroma.Worker.start_link(arg1, arg2, arg3)
# worker(Pleroma.Worker, [arg1, arg2, arg3]),
worker(Cachex, [
:user_cache,
worker(Pleroma.Emoji, []),
worker(
Cachex,
[
default_ttl: 25000,
ttl_interval: 1000,
limit: 2500
]
]),
:user_cache,
[
default_ttl: 25000,
ttl_interval: 1000,
limit: 2500
]
],
id: :cachex_user
),
worker(
Cachex,
[
:object_cache,
[
default_ttl: 25000,
ttl_interval: 1000,
limit: 2500
]
],
id: :cachex_object
),
worker(
Cachex,
[
@@ -39,19 +64,17 @@ def start(_type, _args) do
],
id: :cachex_idem
),
worker(Pleroma.Web.Federator.RetryQueue, []),
worker(Pleroma.Web.Federator, []),
worker(Pleroma.Gopher.Server, []),
worker(Pleroma.Stats, [])
] ++
if Mix.env() == :test,
do: [],
else:
[worker(Pleroma.Web.Streamer, [])] ++
if(
!chat_enabled(),
do: [],
else: [worker(Pleroma.Web.ChatChannel.ChatChannelState, [])]
)
streamer_child() ++
chat_child() ++
[
# Start the endpoint when the application starts
supervisor(Pleroma.Web.Endpoint, []),
worker(Pleroma.Gopher.Server, [])
]
# See http://elixir-lang.org/docs/stable/elixir/Supervisor.html
# for other strategies and supported options
@@ -59,7 +82,20 @@ def start(_type, _args) do
Supervisor.start_link(children, opts)
end
defp chat_enabled do
Application.get_env(:pleroma, :chat, []) |> Keyword.get(:enabled)
if Mix.env() == :test do
defp streamer_child(), do: []
defp chat_child(), do: []
else
defp streamer_child() do
[worker(Pleroma.Web.Streamer, [])]
end
defp chat_child() do
if Pleroma.Config.get([:chat, :enabled]) do
[worker(Pleroma.Web.ChatChannel.ChatChannelState, [])]
else
[]
end
end
end
end

lib/pleroma/config.ex Normal file (56 lines changed)

@@ -0,0 +1,56 @@
defmodule Pleroma.Config do
defmodule Error do
defexception [:message]
end
def get(key), do: get(key, nil)
def get([key], default), do: get(key, default)
def get([parent_key | keys], default) do
Application.get_env(:pleroma, parent_key)
|> get_in(keys) || default
end
def get(key, default) do
Application.get_env(:pleroma, key, default)
end
def get!(key) do
value = get(key, nil)
if value == nil do
raise(Error, message: "Missing configuration value: #{inspect(key)}")
else
value
end
end
def put([key], value), do: put(key, value)
def put([parent_key | keys], value) do
parent =
Application.get_env(:pleroma, parent_key)
|> put_in(keys, value)
Application.put_env(:pleroma, parent_key, parent)
end
def put(key, value) do
Application.put_env(:pleroma, key, value)
end
def delete([key]), do: delete(key)
def delete([parent_key | keys]) do
{_, parent} =
Application.get_env(:pleroma, parent_key)
|> get_and_update_in(keys, fn _ -> :pop end)
Application.put_env(:pleroma, parent_key, parent)
end
def delete(key) do
Application.delete_env(:pleroma, key)
end
end
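For reference, a small usage sketch of this accessor; the key paths come from calls made by the `MigrateLocalUploads` task and `Pleroma.Gopher.Server`:
```
# Read with a default
port = Pleroma.Config.get([:gopher, :port], 1234)

# Read a required value; raises Pleroma.Config.Error when it is missing
local_path = Pleroma.Config.get!([Pleroma.Uploaders.Local, :uploads])

# Override a value in the application environment at runtime
Pleroma.Config.put([Pleroma.Upload, :uploader], Pleroma.Uploaders.S3)
```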

lib/pleroma/emoji.ex Normal file (194 lines changed)

@@ -0,0 +1,194 @@
defmodule Pleroma.Emoji do
@moduledoc """
The emojis are loaded from:
* the built-in Finmojis (if enabled in configuration),
* the files: `config/emoji.txt` and `config/custom_emoji.txt`
* glob paths
This GenServer stores in an ETS table the list of the loaded emojis, and also allows to reload the list at runtime.
"""
use GenServer
@ets __MODULE__.Ets
@ets_options [:set, :protected, :named_table, {:read_concurrency, true}]
@doc false
def start_link() do
GenServer.start_link(__MODULE__, [], name: __MODULE__)
end
@doc "Reloads the emojis from disk."
@spec reload() :: :ok
def reload() do
GenServer.call(__MODULE__, :reload)
end
@doc "Returns the path of the emoji `name`."
@spec get(String.t()) :: String.t() | nil
def get(name) do
case :ets.lookup(@ets, name) do
[{_, path}] -> path
_ -> nil
end
end
@doc "Returns all the emojos!!"
@spec get_all() :: [{String.t(), String.t()}, ...]
def get_all() do
:ets.tab2list(@ets)
end
@doc false
def init(_) do
@ets = :ets.new(@ets, @ets_options)
GenServer.cast(self(), :reload)
{:ok, nil}
end
@doc false
def handle_cast(:reload, state) do
load()
{:noreply, state}
end
@doc false
def handle_call(:reload, _from, state) do
load()
{:reply, :ok, state}
end
@doc false
def terminate(_, _) do
:ok
end
@doc false
def code_change(_old_vsn, state, _extra) do
load()
{:ok, state}
end
defp load() do
emojis =
(load_finmoji(Keyword.get(Application.get_env(:pleroma, :instance), :finmoji_enabled)) ++
load_from_file("config/emoji.txt") ++
load_from_file("config/custom_emoji.txt") ++
load_from_globs(
Keyword.get(Application.get_env(:pleroma, :emoji, []), :shortcode_globs, [])
))
|> Enum.reject(fn value -> value == nil end)
true = :ets.insert(@ets, emojis)
:ok
end
@finmoji [
"a_trusted_friend",
"alandislands",
"association",
"auroraborealis",
"baby_in_a_box",
"bear",
"black_gold",
"christmasparty",
"crosscountryskiing",
"cupofcoffee",
"education",
"fashionista_finns",
"finnishlove",
"flag",
"forest",
"four_seasons_of_bbq",
"girlpower",
"handshake",
"happiness",
"headbanger",
"icebreaker",
"iceman",
"joulutorttu",
"kaamos",
"kalsarikannit_f",
"kalsarikannit_m",
"karjalanpiirakka",
"kicksled",
"kokko",
"lavatanssit",
"losthopes_f",
"losthopes_m",
"mattinykanen",
"meanwhileinfinland",
"moominmamma",
"nordicfamily",
"out_of_office",
"peacemaker",
"perkele",
"pesapallo",
"polarbear",
"pusa_hispida_saimensis",
"reindeer",
"sami",
"sauna_f",
"sauna_m",
"sauna_whisk",
"sisu",
"stuck",
"suomimainittu",
"superfood",
"swan",
"the_cap",
"the_conductor",
"the_king",
"the_voice",
"theoriginalsanta",
"tomoffinland",
"torillatavataan",
"unbreakable",
"waiting",
"white_nights",
"woollysocks"
]
defp load_finmoji(true) do
Enum.map(@finmoji, fn finmoji ->
{finmoji, "/finmoji/128px/#{finmoji}-128.png"}
end)
end
defp load_finmoji(_), do: []
defp load_from_file(file) do
if File.exists?(file) do
load_from_file_stream(File.stream!(file))
else
[]
end
end
defp load_from_file_stream(stream) do
stream
|> Stream.map(&String.strip/1)
|> Stream.map(fn line ->
case String.split(line, ~r/,\s*/) do
[name, file] -> {name, file}
_ -> nil
end
end)
|> Enum.to_list()
end
defp load_from_globs(globs) do
static_path = Path.join(:code.priv_dir(:pleroma), "static")
paths =
Enum.map(globs, fn glob ->
Path.join(static_path, glob)
|> Path.wildcard()
end)
|> Enum.concat()
Enum.map(paths, fn path ->
shortcode = Path.basename(path, Path.extname(path))
external_path = Path.join("/", Path.relative_to(path, static_path))
{shortcode, external_path}
end)
end
end
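A brief usage sketch of the public API above ("firefox" is a hypothetical shortcode):
```
# Rebuild the ETS table after adding emoji files or changing the config
:ok = Pleroma.Emoji.reload()

# Look up a single shortcode; returns the file path or nil
Pleroma.Emoji.get("firefox")

# List every loaded {shortcode, path} pair
Pleroma.Emoji.get_all()
```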

lib/pleroma/filter.ex Normal file (90 lines changed)

@@ -0,0 +1,90 @@
defmodule Pleroma.Filter do
use Ecto.Schema
import Ecto.{Changeset, Query}
alias Pleroma.{User, Repo, Activity}
schema "filters" do
belongs_to(:user, Pleroma.User)
field(:filter_id, :integer)
field(:hide, :boolean, default: false)
field(:whole_word, :boolean, default: true)
field(:phrase, :string)
field(:context, {:array, :string})
field(:expires_at, :utc_datetime)
timestamps()
end
def get(id, %{id: user_id} = _user) do
query =
from(
f in Pleroma.Filter,
where: f.filter_id == ^id,
where: f.user_id == ^user_id
)
Repo.one(query)
end
def get_filters(%Pleroma.User{id: user_id} = user) do
query =
from(
f in Pleroma.Filter,
where: f.user_id == ^user_id
)
Repo.all(query)
end
def create(%Pleroma.Filter{user_id: user_id, filter_id: nil} = filter) do
# If filter_id wasn't given, use the max filter_id for this user plus 1.
# XXX This could result in a race condition if a user tries to add two
# different filters for their account from two different clients at the
# same time, but that should be unlikely.
max_id_query =
from(
f in Pleroma.Filter,
where: f.user_id == ^user_id,
select: max(f.filter_id)
)
filter_id =
case Repo.one(max_id_query) do
# Start allocating from 1
nil ->
1
max_id ->
max_id + 1
end
filter
|> Map.put(:filter_id, filter_id)
|> Repo.insert()
end
def create(%Pleroma.Filter{} = filter) do
Repo.insert(filter)
end
def delete(%Pleroma.Filter{id: filter_key} = filter) when is_number(filter_key) do
Repo.delete(filter)
end
def delete(%Pleroma.Filter{id: filter_key} = filter) when is_nil(filter_key) do
%Pleroma.Filter{id: id} = get(filter.filter_id, %{id: filter.user_id})
filter
|> Map.put(:id, id)
|> Repo.delete()
end
def update(%Pleroma.Filter{} = filter) do
destination = Map.from_struct(filter)
Pleroma.Filter.get(filter.filter_id, %{id: filter.user_id})
|> cast(destination, [:phrase, :context, :hide, :expires_at, :whole_word])
|> Repo.update()
end
end
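A hypothetical call into this schema (the field values are made up; `filter_id` is allocated automatically while it is left at its `nil` default):
```
{:ok, filter} =
  Pleroma.Filter.create(%Pleroma.Filter{
    # id of an existing %Pleroma.User{}
    user_id: user.id,
    phrase: "cofe",
    context: ["home", "public"],
    hide: true
  })
```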


@@ -1,6 +1,8 @@
defmodule Pleroma.Formatter do
alias Pleroma.User
alias Pleroma.Web.MediaProxy
alias Pleroma.HTML
alias Pleroma.Emoji
@tag_regex ~r/\#\w+/u
def parse_tags(text, data \\ %{}) do
@@ -16,7 +18,7 @@ def parse_tags(text, data \\ %{}) do
def parse_mentions(text) do
# Modified from https://www.w3.org/TR/html5/forms.html#valid-e-mail-address
regex =
~r/@[a-zA-Z0-9.!#$%&'*+\/=?^_`{|}~-]+@?[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*/u
~r/@[a-zA-Z0-9.!#$%&'*+\/=?^_`{|}~-]*@?[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*/u
Regex.scan(regex, text)
|> List.flatten()
@@ -27,125 +29,16 @@ def parse_mentions(text) do
|> Enum.filter(fn {_match, user} -> user end)
end
@finmoji [
"a_trusted_friend",
"alandislands",
"association",
"auroraborealis",
"baby_in_a_box",
"bear",
"black_gold",
"christmasparty",
"crosscountryskiing",
"cupofcoffee",
"education",
"fashionista_finns",
"finnishlove",
"flag",
"forest",
"four_seasons_of_bbq",
"girlpower",
"handshake",
"happiness",
"headbanger",
"icebreaker",
"iceman",
"joulutorttu",
"kaamos",
"kalsarikannit_f",
"kalsarikannit_m",
"karjalanpiirakka",
"kicksled",
"kokko",
"lavatanssit",
"losthopes_f",
"losthopes_m",
"mattinykanen",
"meanwhileinfinland",
"moominmamma",
"nordicfamily",
"out_of_office",
"peacemaker",
"perkele",
"pesapallo",
"polarbear",
"pusa_hispida_saimensis",
"reindeer",
"sami",
"sauna_f",
"sauna_m",
"sauna_whisk",
"sisu",
"stuck",
"suomimainittu",
"superfood",
"swan",
"the_cap",
"the_conductor",
"the_king",
"the_voice",
"theoriginalsanta",
"tomoffinland",
"torillatavataan",
"unbreakable",
"waiting",
"white_nights",
"woollysocks"
]
def emojify(text) do
emojify(text, Emoji.get_all())
end
@finmoji_with_filenames Enum.map(@finmoji, fn finmoji ->
{finmoji, "/finmoji/128px/#{finmoji}-128.png"}
end)
@emoji_from_file (with {:ok, default} <- File.read("config/emoji.txt") do
custom =
with {:ok, custom} <- File.read("config/custom_emoji.txt") do
custom
else
_e -> ""
end
(default <> "\n" <> custom)
|> String.trim()
|> String.split(~r/\n+/)
|> Enum.map(fn line ->
[name, file] = String.split(line, ~r/,\s*/)
{name, file}
end)
else
_ -> []
end)
@emoji_from_globs (
static_path = Path.join(:code.priv_dir(:pleroma), "static")
globs =
Application.get_env(:pleroma, :emoji, [])
|> Keyword.get(:shortcode_globs, [])
paths =
Enum.map(globs, fn glob ->
Path.join(static_path, glob)
|> Path.wildcard()
end)
|> Enum.concat()
Enum.map(paths, fn path ->
shortcode = Path.basename(path, Path.extname(path))
external_path = Path.join("/", Path.relative_to(path, static_path))
{shortcode, external_path}
end)
)
@emoji @finmoji_with_filenames ++ @emoji_from_globs ++ @emoji_from_file
def emojify(text, emoji \\ @emoji)
def emojify(text, nil), do: text
def emojify(text, emoji) do
Enum.reduce(emoji, text, fn {emoji, file}, text ->
emoji = HtmlSanitizeEx.strip_tags(emoji)
file = HtmlSanitizeEx.strip_tags(file)
emoji = HTML.strip_tags(emoji)
file = HTML.strip_tags(file)
String.replace(
text,
@@ -154,41 +47,27 @@ def emojify(text, emoji) do
MediaProxy.url(file)
}' />"
)
|> HTML.filter_tags()
end)
end
def get_emoji(text) do
Enum.filter(@emoji, fn {emoji, _} -> String.contains?(text, ":#{emoji}:") end)
def get_emoji(text) when is_binary(text) do
Enum.filter(Emoji.get_all(), fn {emoji, _} -> String.contains?(text, ":#{emoji}:") end)
end
def get_custom_emoji() do
@emoji
end
def get_emoji(_), do: []
@link_regex ~r/[0-9a-z+\-\.]+:[0-9a-z$-_.+!*'(),]+/ui
# IANA got a list https://www.iana.org/assignments/uri-schemes/ but
# Stuff like ipfs isnt in it
# There is very niche stuff
@uri_schemes [
"https://",
"http://",
"dat://",
"dweb://",
"gopher://",
"ipfs://",
"ipns://",
"irc:",
"ircs:",
"magnet:",
"mailto:",
"mumble:",
"ssb://",
"xmpp:"
]
@uri_schemes Application.get_env(:pleroma, :uri_schemes, [])
@valid_schemes Keyword.get(@uri_schemes, :valid_schemes, [])
# TODO: make it use something other than @link_regex
def html_escape(text) do
def html_escape(text, "text/html") do
HTML.filter_tags(text)
end
def html_escape(text, "text/plain") do
Regex.split(@link_regex, text, include_captures: true)
|> Enum.map_every(2, fn chunk ->
{:safe, part} = Phoenix.HTML.html_escape(chunk)
@@ -199,14 +78,10 @@ def html_escape(text) do
@doc "changes scheme:... urls to html links"
def add_links({subs, text}) do
additionnal_schemes =
Application.get_env(:pleroma, :uri_schemes, [])
|> Keyword.get(:additionnal_schemes, [])
links =
text
|> String.split([" ", "\t", "<br>"])
|> Enum.filter(fn word -> String.starts_with?(word, @uri_schemes ++ additionnal_schemes) end)
|> Enum.filter(fn word -> String.starts_with?(word, @valid_schemes) end)
|> Enum.filter(fn word -> Regex.match?(@link_regex, word) end)
|> Enum.map(fn url -> {Ecto.UUID.generate(), url} end)
|> Enum.sort_by(fn {_, url} -> -String.length(url) end)
@@ -218,13 +93,7 @@ def add_links({subs, text}) do
subs =
subs ++
Enum.map(links, fn {uuid, url} ->
{:safe, link} = Phoenix.HTML.Link.link(url, to: url)
link =
link
|> IO.iodata_to_binary()
{uuid, link}
{uuid, "<a href=\"#{url}\">#{url}</a>"}
end)
{subs, uuid_text}
@@ -246,7 +115,12 @@ def add_user_links({subs, text}, mentions) do
subs =
subs ++
Enum.map(mentions, fn {match, %User{ap_id: ap_id, info: info}, uuid} ->
ap_id = info["source_data"]["url"] || ap_id
ap_id =
if is_binary(info["source_data"]["url"]) do
info["source_data"]["url"]
else
ap_id
end
short_match = String.split(match, "@") |> tl() |> hd()


@@ -1,32 +1,33 @@
defmodule Pleroma.Gopher.Server do
use GenServer
require Logger
@gopher Application.get_env(:pleroma, :gopher)
def start_link() do
ip = Keyword.get(@gopher, :ip, {0, 0, 0, 0})
port = Keyword.get(@gopher, :port, 1234)
GenServer.start_link(__MODULE__, [ip, port], [])
config = Pleroma.Config.get(:gopher, [])
ip = Keyword.get(config, :ip, {0, 0, 0, 0})
port = Keyword.get(config, :port, 1234)
if Keyword.get(config, :enabled, false) do
GenServer.start_link(__MODULE__, [ip, port], [])
else
Logger.info("Gopher server disabled")
:ignore
end
end
def init([ip, port]) do
if Keyword.get(@gopher, :enabled, false) do
Logger.info("Starting gopher server on #{port}")
Logger.info("Starting gopher server on #{port}")
:ranch.start_listener(
:gopher,
100,
:ranch_tcp,
[port: port],
__MODULE__.ProtocolHandler,
[]
)
:ranch.start_listener(
:gopher,
100,
:ranch_tcp,
[port: port],
__MODULE__.ProtocolHandler,
[]
)
{:ok, %{ip: ip, port: port}}
else
Logger.info("Gopher server disabled")
{:ok, nil}
end
{:ok, %{ip: ip, port: port}}
end
end
@@ -35,9 +36,7 @@ defmodule Pleroma.Gopher.Server.ProtocolHandler do
alias Pleroma.User
alias Pleroma.Activity
alias Pleroma.Repo
@instance Application.get_env(:pleroma, :instance)
@gopher Application.get_env(:pleroma, :gopher)
alias Pleroma.HTML
def start_link(ref, socket, transport, opts) do
pid = spawn_link(__MODULE__, :init, [ref, socket, transport, opts])
@@ -61,7 +60,7 @@ def info(text) do
def link(name, selector, type \\ 1) do
address = Pleroma.Web.Endpoint.host()
port = Keyword.get(@gopher, :port, 1234)
port = Pleroma.Config.get([:gopher, :port], 1234)
"#{type}#{name}\t#{selector}\t#{address}\t#{port}\r\n"
end
@@ -78,17 +77,13 @@ def render_activities(activities) do
link("Post ##{activity.id} by #{user.nickname}", "/notices/#{activity.id}") <>
info("#{like_count} likes, #{announcement_count} repeats") <>
"i\tfake\t(NULL)\t0\r\n" <>
info(
HtmlSanitizeEx.strip_tags(
String.replace(activity.data["object"]["content"], "<br>", "\r")
)
)
info(HTML.strip_tags(String.replace(activity.data["object"]["content"], "<br>", "\r")))
end)
|> Enum.join("i\tfake\t(NULL)\t0\r\n")
end
def response("") do
info("Welcome to #{Keyword.get(@instance, :name, "Pleroma")}!") <>
info("Welcome to #{Pleroma.Config.get([:instance, :name], "Pleroma")}!") <>
link("Public Timeline", "/main/public") <>
link("Federated Timeline", "/main/all") <> ".\r\n"
end

lib/pleroma/html.ex Normal file (185 lines changed)

@@ -0,0 +1,185 @@
defmodule Pleroma.HTML do
alias HtmlSanitizeEx.Scrubber
defp get_scrubbers(scrubber) when is_atom(scrubber), do: [scrubber]
defp get_scrubbers(scrubbers) when is_list(scrubbers), do: scrubbers
defp get_scrubbers(_), do: [Pleroma.HTML.Scrubber.Default]
def get_scrubbers() do
Pleroma.Config.get([:markup, :scrub_policy])
|> get_scrubbers
end
def filter_tags(html, nil) do
get_scrubbers()
|> Enum.reduce(html, fn scrubber, html ->
filter_tags(html, scrubber)
end)
end
def filter_tags(html, scrubber) do
html |> Scrubber.scrub(scrubber)
end
def filter_tags(html), do: filter_tags(html, nil)
def strip_tags(html) do
html |> Scrubber.scrub(Scrubber.StripTags)
end
end
defmodule Pleroma.HTML.Scrubber.TwitterText do
@moduledoc """
An HTML scrubbing policy which limits to twitter-style text. Only
paragraphs, breaks and links are allowed through the filter.
"""
@markup Application.get_env(:pleroma, :markup)
@uri_schemes Application.get_env(:pleroma, :uri_schemes, [])
@valid_schemes Keyword.get(@uri_schemes, :valid_schemes, [])
require HtmlSanitizeEx.Scrubber.Meta
alias HtmlSanitizeEx.Scrubber.Meta
Meta.remove_cdata_sections_before_scrub()
Meta.strip_comments()
# links
Meta.allow_tag_with_uri_attributes("a", ["href"], @valid_schemes)
Meta.allow_tag_with_these_attributes("a", ["name", "title"])
# paragraphs and linebreaks
Meta.allow_tag_with_these_attributes("br", [])
Meta.allow_tag_with_these_attributes("p", [])
# microformats
Meta.allow_tag_with_these_attributes("span", [])
# allow inline images for custom emoji
@allow_inline_images Keyword.get(@markup, :allow_inline_images)
if @allow_inline_images do
# restrict img tags to http/https only, because of MediaProxy.
Meta.allow_tag_with_uri_attributes("img", ["src"], ["http", "https"])
Meta.allow_tag_with_these_attributes("img", [
"width",
"height",
"title",
"alt"
])
end
Meta.strip_everything_not_covered()
end
defmodule Pleroma.HTML.Scrubber.Default do
@doc "The default HTML scrubbing policy: no "
require HtmlSanitizeEx.Scrubber.Meta
alias HtmlSanitizeEx.Scrubber.Meta
@markup Application.get_env(:pleroma, :markup)
@uri_schemes Application.get_env(:pleroma, :uri_schemes, [])
@valid_schemes Keyword.get(@uri_schemes, :valid_schemes, [])
Meta.remove_cdata_sections_before_scrub()
Meta.strip_comments()
Meta.allow_tag_with_uri_attributes("a", ["href"], @valid_schemes)
Meta.allow_tag_with_these_attributes("a", ["name", "title"])
Meta.allow_tag_with_these_attributes("abbr", ["title"])
Meta.allow_tag_with_these_attributes("b", [])
Meta.allow_tag_with_these_attributes("blockquote", [])
Meta.allow_tag_with_these_attributes("br", [])
Meta.allow_tag_with_these_attributes("code", [])
Meta.allow_tag_with_these_attributes("del", [])
Meta.allow_tag_with_these_attributes("em", [])
Meta.allow_tag_with_these_attributes("i", [])
Meta.allow_tag_with_these_attributes("li", [])
Meta.allow_tag_with_these_attributes("ol", [])
Meta.allow_tag_with_these_attributes("p", [])
Meta.allow_tag_with_these_attributes("pre", [])
Meta.allow_tag_with_these_attributes("span", [])
Meta.allow_tag_with_these_attributes("strong", [])
Meta.allow_tag_with_these_attributes("u", [])
Meta.allow_tag_with_these_attributes("ul", [])
@allow_inline_images Keyword.get(@markup, :allow_inline_images)
if @allow_inline_images do
# restrict img tags to http/https only, because of MediaProxy.
Meta.allow_tag_with_uri_attributes("img", ["src"], ["http", "https"])
Meta.allow_tag_with_these_attributes("img", [
"width",
"height",
"title",
"alt"
])
end
@allow_tables Keyword.get(@markup, :allow_tables)
if @allow_tables do
Meta.allow_tag_with_these_attributes("table", [])
Meta.allow_tag_with_these_attributes("tbody", [])
Meta.allow_tag_with_these_attributes("td", [])
Meta.allow_tag_with_these_attributes("th", [])
Meta.allow_tag_with_these_attributes("thead", [])
Meta.allow_tag_with_these_attributes("tr", [])
end
@allow_headings Keyword.get(@markup, :allow_headings)
if @allow_headings do
Meta.allow_tag_with_these_attributes("h1", [])
Meta.allow_tag_with_these_attributes("h2", [])
Meta.allow_tag_with_these_attributes("h3", [])
Meta.allow_tag_with_these_attributes("h4", [])
Meta.allow_tag_with_these_attributes("h5", [])
end
@allow_fonts Keyword.get(@markup, :allow_fonts)
if @allow_fonts do
Meta.allow_tag_with_these_attributes("font", ["face"])
end
Meta.strip_everything_not_covered()
end
defmodule Pleroma.HTML.Transform.MediaProxy do
@moduledoc "Transforms inline image URIs to use MediaProxy."
alias Pleroma.Web.MediaProxy
def before_scrub(html), do: html
def scrub_attribute("img", {"src", "http" <> target}) do
media_url =
("http" <> target)
|> MediaProxy.url()
{"src", media_url}
end
def scrub_attribute(tag, attribute), do: attribute
def scrub({"img", attributes, children}) do
attributes =
attributes
|> Enum.map(fn attr -> scrub_attribute("img", attr) end)
|> Enum.reject(&is_nil(&1))
{"img", attributes, children}
end
def scrub({:comment, children}), do: ""
def scrub({tag, attributes, children}), do: {tag, attributes, children}
def scrub({tag, children}), do: children
def scrub(text), do: text
end

View file

@ -1,13 +1,37 @@
defmodule Pleroma.HTTP do
use HTTPoison.Base
require HTTPoison
def request(method, url, body \\ "", headers \\ [], options \\ []) do
options =
process_request_options(options)
|> process_sni_options(url)
HTTPoison.request(method, url, body, headers, options)
end
defp process_sni_options(options, url) do
uri = URI.parse(url)
host = uri.host |> to_charlist()
case uri.scheme do
"https" -> options ++ [ssl: [server_name_indication: host]]
_ -> options
end
end
def process_request_options(options) do
config = Application.get_env(:pleroma, :http, [])
proxy = Keyword.get(config, :proxy_url, nil)
options = options ++ [hackney: [pool: :default]]
case proxy do
nil -> options
_ -> options ++ [proxy: proxy]
end
end
def get(url, headers \\ [], options \\ []), do: request(:get, url, "", headers, options)
def post(url, body, headers \\ [], options \\ []),
do: request(:post, url, body, headers, options)
end

View file

@ -69,6 +69,25 @@ def get_lists_from_activity(%Activity{actor: ap_id}) do
Repo.all(query)
end
# Get lists to which the account belongs.
def get_lists_account_belongs(%User{} = owner, account_id) do
user = Repo.get(User, account_id)
query =
from(
l in Pleroma.List,
where:
l.user_id == ^owner.id and
fragment(
"? = ANY(?)",
^user.follower_address,
l.following
)
)
Repo.all(query)
end
def rename(%Pleroma.List{} = list, title) do
list
|> title_changeset(%{title: title})

108
lib/pleroma/mime.ex Normal file
View file

@ -0,0 +1,108 @@
defmodule Pleroma.MIME do
@moduledoc """
Returns the mime-type of a binary and optionally a normalized file-name.
"""
@default "application/octet-stream"
@read_bytes 31
@spec file_mime_type(String.t()) ::
{:ok, content_type :: String.t(), filename :: String.t()} | {:error, any()} | :error
def file_mime_type(path, filename) do
with {:ok, content_type} <- file_mime_type(path),
filename <- fix_extension(filename, content_type) do
{:ok, content_type, filename}
end
end
@spec file_mime_type(String.t()) :: {:ok, String.t()} | {:error, any()} | :error
def file_mime_type(filename) do
File.open(filename, [:read], fn f ->
check_mime_type(IO.binread(f, @read_bytes))
end)
end
def bin_mime_type(binary, filename) do
with {:ok, content_type} <- bin_mime_type(binary),
filename <- fix_extension(filename, content_type) do
{:ok, content_type, filename}
end
end
@spec bin_mime_type(binary()) :: {:ok, String.t()} | :error
def bin_mime_type(<<head::binary-size(@read_bytes), _::binary>>) do
{:ok, check_mime_type(head)}
end
def mime_type(<<_::binary>>), do: {:ok, @default}
def bin_mime_type(_), do: :error
defp fix_extension(filename, content_type) do
parts = String.split(filename, ".")
new_filename =
if length(parts) > 1 do
Enum.drop(parts, -1) |> Enum.join(".")
else
Enum.join(parts)
end
cond do
content_type == "application/octet-stream" ->
filename
ext = List.first(MIME.extensions(content_type)) ->
new_filename <> "." <> ext
true ->
Enum.join([new_filename, String.split(content_type, "/") |> List.last()], ".")
end
end
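# Each clause below matches a well-known file signature ("magic number"):
# PNG, GIF, JPEG, WebM/Matroska, MP4, MP3 (both ID3-tagged and bare frame sync),
# Ogg Theora video, Ogg audio, and RIFF/WAV, falling back to application/octet-stream.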
defp check_mime_type(<<0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A, _::binary>>) do
"image/png"
end
defp check_mime_type(<<0x47, 0x49, 0x46, 0x38, _, 0x61, _::binary>>) do
"image/gif"
end
defp check_mime_type(<<0xFF, 0xD8, 0xFF, _::binary>>) do
"image/jpeg"
end
defp check_mime_type(<<0x1A, 0x45, 0xDF, 0xA3, _::binary>>) do
"video/webm"
end
defp check_mime_type(<<0x00, 0x00, 0x00, _, 0x66, 0x74, 0x79, 0x70, _::binary>>) do
"video/mp4"
end
defp check_mime_type(<<0x49, 0x44, 0x33, _::binary>>) do
"audio/mpeg"
end
defp check_mime_type(<<255, 251, _, 68, 0, 0, 0, 0, _::binary>>) do
"audio/mpeg"
end
defp check_mime_type(
<<0x4F, 0x67, 0x67, 0x53, 0x00, 0x02, 0x00, 0x00, _::size(160), 0x80, 0x74, 0x68, 0x65,
0x6F, 0x72, 0x61, _::binary>>
) do
"video/ogg"
end
defp check_mime_type(<<0x4F, 0x67, 0x67, 0x53, 0x00, 0x02, 0x00, 0x00, _::binary>>) do
"audio/ogg"
end
defp check_mime_type(<<0x52, 0x49, 0x46, 0x46, _::binary>>) do
"audio/wav"
end
defp check_mime_type(_) do
@default
end
end

View file

@ -1,6 +1,6 @@
defmodule Pleroma.Notification do
use Ecto.Schema
alias Pleroma.{User, Activity, Notification, Repo}
alias Pleroma.{User, Activity, Notification, Repo, Object}
import Ecto.Query
schema "notifications" do
@ -42,6 +42,20 @@ def for_user(user, opts \\ %{}) do
Repo.all(query)
end
def set_read_up_to(%{id: user_id} = _user, id) do
query =
from(
n in Notification,
where: n.user_id == ^user_id,
where: n.id <= ^id,
update: [
set: [seen: true]
]
)
Repo.update_all(query, [])
end
def get(%{id: user_id} = _user, id) do
query =
from(
@ -81,7 +95,7 @@ def dismiss(%{id: user_id} = _user, id) do
def create_notifications(%Activity{id: _, data: %{"to" => _, "type" => type}} = activity)
when type in ["Create", "Like", "Announce", "Follow"] do
users = User.get_notified_from_activity(activity)
users = get_notified_from_activity(activity)
notifications = Enum.map(users, fn user -> create_notification(activity, user) end)
{:ok, notifications}
@ -99,4 +113,64 @@ def create_notification(%Activity{} = activity, %User{} = user) do
notification
end
end
def get_notified_from_activity(activity, local_only \\ true)
def get_notified_from_activity(
%Activity{data: %{"to" => _, "type" => type} = data} = activity,
local_only
)
when type in ["Create", "Like", "Announce", "Follow"] do
recipients =
[]
|> maybe_notify_to_recipients(activity)
|> maybe_notify_mentioned_recipients(activity)
|> Enum.uniq()
User.get_users_from_set(recipients, local_only)
end
def get_notified_from_activity(_, local_only), do: []
defp maybe_notify_to_recipients(
recipients,
%Activity{data: %{"to" => to, "type" => type}} = activity
) do
recipients ++ to
end
defp maybe_notify_mentioned_recipients(
recipients,
%Activity{data: %{"to" => to, "type" => type} = data} = activity
)
when type == "Create" do
object = Object.normalize(data["object"])
object_data =
cond do
!is_nil(object) ->
object.data
is_map(data["object"]) ->
data["object"]
true ->
%{}
end
tagged_mentions = maybe_extract_mentions(object_data)
recipients ++ tagged_mentions
end
defp maybe_notify_mentioned_recipients(recipients, _), do: recipients
defp maybe_extract_mentions(%{"tag" => tag}) do
tag
|> Enum.filter(fn x -> is_map(x) end)
|> Enum.filter(fn x -> x["type"] == "Mention" end)
|> Enum.map(fn x -> x["href"] end)
end
defp maybe_extract_mentions(_), do: []
end

View file

@ -1,6 +1,6 @@
defmodule Pleroma.Object do
use Ecto.Schema
alias Pleroma.{Repo, Object}
alias Pleroma.{Repo, Object, Activity}
import Ecto.{Query, Changeset}
schema "objects" do
@ -31,13 +31,15 @@ def normalize(obj) when is_map(obj), do: Object.get_by_ap_id(obj["id"])
def normalize(ap_id) when is_binary(ap_id), do: Object.get_by_ap_id(ap_id)
def normalize(_), do: nil
def get_cached_by_ap_id(ap_id) do
if Mix.env() == :test do
if Mix.env() == :test do
def get_cached_by_ap_id(ap_id) do
get_by_ap_id(ap_id)
else
end
else
def get_cached_by_ap_id(ap_id) do
key = "object:#{ap_id}"
Cachex.fetch!(:user_cache, key, fn _ ->
Cachex.fetch!(:object_cache, key, fn _ ->
object = get_by_ap_id(ap_id)
if object do
@ -52,4 +54,12 @@ def get_cached_by_ap_id(ap_id) do
def context_mapping(context) do
Object.change(%Object{}, %{data: %{"id" => context}})
end
def delete(%Object{data: %{"id" => id}} = object) do
with Repo.delete(object),
Repo.delete_all(Activity.all_non_create_by_object_ap_id_q(id)),
{:ok, true} <- Cachex.del(:object_cache, "object:#{id}") do
{:ok, object}
end
end
end

View file

@ -9,54 +9,34 @@ def init(options) do
def call(%{assigns: %{user: %User{}}} = conn, _), do: conn
def call(conn, opts) do
with {:ok, username, password} <- decode_header(conn),
{:ok, user} <- opts[:fetcher].(username),
false <- !!user.info["deactivated"],
saved_user_id <- get_session(conn, :user_id),
{:ok, verified_user} <- verify(user, password, saved_user_id) do
def call(
%{
assigns: %{
auth_user: %{password_hash: password_hash} = auth_user,
auth_credentials: %{password: password}
}
} = conn,
_
) do
if Pbkdf2.checkpw(password, password_hash) do
conn
|> assign(:user, verified_user)
|> put_session(:user_id, verified_user.id)
|> assign(:user, auth_user)
else
_ -> conn |> halt_or_continue(opts)
conn
end
end
# Short-circuit if we have a cookie with the id for the given user.
defp verify(%{id: id} = user, _password, id) do
{:ok, user}
end
defp verify(nil, _password, _user_id) do
def call(
%{
assigns: %{
auth_credentials: %{password: password}
}
} = conn,
_
) do
Pbkdf2.dummy_checkpw()
:error
end
defp verify(user, password, _user_id) do
if Pbkdf2.checkpw(password, user.password_hash) do
{:ok, user}
else
:error
end
end
defp decode_header(conn) do
with ["Basic " <> header] <- get_req_header(conn, "authorization"),
{:ok, userinfo} <- Base.decode64(header),
[username, password] <- String.split(userinfo, ":", parts: 2) do
{:ok, username, password}
end
end
defp halt_or_continue(conn, %{optional: true}) do
conn |> assign(:user, nil)
end
defp halt_or_continue(conn, _) do
conn
|> put_resp_content_type("application/json")
|> send_resp(403, Jason.encode!(%{error: "Invalid credentials."}))
|> halt
end
def call(conn, _), do: conn
end

View file

@ -0,0 +1,21 @@
defmodule Pleroma.Plugs.BasicAuthDecoderPlug do
import Plug.Conn
def init(options) do
options
end
def call(conn, opts) do
with ["Basic " <> header] <- get_req_header(conn, "authorization"),
{:ok, userinfo} <- Base.decode64(header),
[username, password] <- String.split(userinfo, ":", parts: 2) do
conn
|> assign(:auth_credentials, %{
username: username,
password: password
})
else
_ -> conn
end
end
end
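# These small authentication plugs are meant to be composed in a pipeline. A plausible
# ordering (an assumption for illustration, not taken from this commit's router) is:
#
#     plug(Pleroma.Plugs.BasicAuthDecoderPlug)
#     plug(Pleroma.Plugs.UserFetcherPlug)
#     plug(Pleroma.Plugs.SessionAuthenticationPlug)
#     plug(Pleroma.Plugs.LegacyAuthenticationPlug)
#     plug(Pleroma.Plugs.AuthenticationPlug)
#     plug(Pleroma.Plugs.UserEnabledPlug)
#     plug(Pleroma.Plugs.SetUserSessionIdPlug)
#     plug(Pleroma.Plugs.EnsureUserKeyPlug)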

View file

@ -0,0 +1,19 @@
defmodule Pleroma.Plugs.EnsureAuthenticatedPlug do
import Plug.Conn
alias Pleroma.User
def init(options) do
options
end
def call(%{assigns: %{user: %User{}}} = conn, _) do
conn
end
def call(conn, _) do
conn
|> put_resp_content_type("application/json")
|> send_resp(403, Jason.encode!(%{error: "Invalid credentials."}))
|> halt
end
end

View file

@ -0,0 +1,14 @@
defmodule Pleroma.Plugs.EnsureUserKeyPlug do
import Plug.Conn
def init(opts) do
opts
end
def call(%{assigns: %{user: _}} = conn, _), do: conn
def call(conn, _) do
conn
|> assign(:user, nil)
end
end

View file

@ -0,0 +1,18 @@
defmodule Pleroma.Web.FederatingPlug do
import Plug.Conn
def init(options) do
options
end
def call(conn, opts) do
if Keyword.get(Application.get_env(:pleroma, :instance), :federating) do
conn
else
conn
|> put_status(404)
|> Phoenix.Controller.render(Pleroma.Web.ErrorView, "404.json")
|> halt()
end
end
end

View file

@ -0,0 +1,63 @@
defmodule Pleroma.Plugs.HTTPSecurityPlug do
alias Pleroma.Config
import Plug.Conn
def init(opts), do: opts
def call(conn, options) do
if Config.get([:http_security, :enabled]) do
conn =
merge_resp_headers(conn, headers())
|> maybe_send_sts_header(Config.get([:http_security, :sts]))
else
conn
end
end
defp headers do
referrer_policy = Config.get([:http_security, :referrer_policy])
[
{"x-xss-protection", "1; mode=block"},
{"x-permitted-cross-domain-policies", "none"},
{"x-frame-options", "DENY"},
{"x-content-type-options", "nosniff"},
{"referrer-policy", referrer_policy},
{"x-download-options", "noopen"},
{"content-security-policy", csp_string() <> ";"}
]
end
defp csp_string do
protocol = Config.get([Pleroma.Web.Endpoint, :protocol])
[
"default-src 'none'",
"base-uri 'self'",
"frame-ancestors 'none'",
"img-src 'self' data: https:",
"media-src 'self' https:",
"style-src 'self' 'unsafe-inline'",
"font-src 'self'",
"script-src 'self'",
"connect-src 'self' " <> String.replace(Pleroma.Web.Endpoint.static_url(), "http", "ws"),
"manifest-src 'self'",
if protocol == "https" do
"upgrade-insecure-requests"
end
]
|> Enum.join("; ")
end
defp maybe_send_sts_header(conn, true) do
max_age_sts = Config.get([:http_security, :sts_max_age])
max_age_ct = Config.get([:http_security, :ct_max_age])
merge_resp_headers(conn, [
{"strict-transport-security", "max-age=#{max_age_sts}; includeSubDomains"},
{"expect-ct", "enforce, max-age=#{max_age_ct}"}
])
end
defp maybe_send_sts_header(conn, _), do: conn
end

View file

@ -0,0 +1,35 @@
defmodule Pleroma.Plugs.LegacyAuthenticationPlug do
import Plug.Conn
alias Pleroma.User
def init(options) do
options
end
def call(%{assigns: %{user: %User{}}} = conn, _), do: conn
def call(
%{
assigns: %{
auth_user: %{password_hash: "$6$" <> _ = password_hash} = auth_user,
auth_credentials: %{password: password}
}
} = conn,
_
) do
with ^password_hash <- :crypt.crypt(password, password_hash),
{:ok, user} <-
User.reset_password(auth_user, %{password: password, password_confirmation: password}) do
conn
|> assign(:auth_user, user)
|> assign(:user, user)
else
_ ->
conn
end
end
def call(conn, _) do
conn
end
end

View file

@ -0,0 +1,18 @@
defmodule Pleroma.Plugs.SessionAuthenticationPlug do
import Plug.Conn
alias Pleroma.User
def init(options) do
options
end
def call(conn, _) do
with saved_user_id <- get_session(conn, :user_id),
%{auth_user: %{id: ^saved_user_id}} <- conn.assigns do
conn
|> assign(:user, conn.assigns.auth_user)
else
_ -> conn
end
end
end

View file

@ -0,0 +1,15 @@
defmodule Pleroma.Plugs.SetUserSessionIdPlug do
import Plug.Conn
alias Pleroma.User
def init(opts) do
opts
end
def call(%{assigns: %{user: %User{id: id}}} = conn, _) do
conn
|> put_session(:user_id, id)
end
def call(conn, _), do: conn
end

View file

@ -0,0 +1,78 @@
defmodule Pleroma.Plugs.UploadedMedia do
@moduledoc """
"""
import Plug.Conn
require Logger
@behaviour Plug
# no slashes
@path "media"
@cache_control %{
default: "public, max-age=1209600",
error: "public, must-revalidate, max-age=160"
}
def init(_opts) do
static_plug_opts =
[]
|> Keyword.put(:from, "__unconfigured_media_plug")
|> Keyword.put(:at, "/__unconfigured_media_plug")
|> Plug.Static.init()
%{static_plug_opts: static_plug_opts}
end
def call(conn = %{request_path: <<"/", @path, "/", file::binary>>}, opts) do
config = Pleroma.Config.get([Pleroma.Upload])
with uploader <- Keyword.fetch!(config, :uploader),
proxy_remote = Keyword.get(config, :proxy_remote, false),
{:ok, get_method} <- uploader.get_file(file) do
get_media(conn, get_method, proxy_remote, opts)
else
_ ->
conn
|> send_resp(500, "Failed")
|> halt()
end
end
def call(conn, _opts), do: conn
defp get_media(conn, {:static_dir, directory}, _, opts) do
static_opts =
Map.get(opts, :static_plug_opts)
|> Map.put(:at, [@path])
|> Map.put(:from, directory)
conn = Plug.Static.call(conn, static_opts)
if conn.halted do
conn
else
conn
|> send_resp(404, "Not found")
|> halt()
end
end
defp get_media(conn, {:url, url}, true, _) do
conn
|> Pleroma.ReverseProxy.call(url, Pleroma.Config.get([Pleroma.Upload, :proxy_opts], []))
end
defp get_media(conn, {:url, url}, _, _) do
conn
|> Phoenix.Controller.redirect(external: url)
|> halt()
end
defp get_media(conn, unknown, _, _) do
Logger.error("#{__MODULE__}: Unknown get startegy: #{inspect(unknown)}")
conn
|> send_resp(500, "Internal Error")
|> halt()
end
end

View file

@ -0,0 +1,17 @@
defmodule Pleroma.Plugs.UserEnabledPlug do
import Plug.Conn
alias Pleroma.User
def init(options) do
options
end
def call(%{assigns: %{user: %User{info: %{"deactivated" => true}}}} = conn, _) do
conn
|> assign(:user, nil)
end
def call(conn, _) do
conn
end
end

View file

@ -0,0 +1,34 @@
defmodule Pleroma.Plugs.UserFetcherPlug do
import Plug.Conn
alias Pleroma.Repo
alias Pleroma.User
def init(options) do
options
end
def call(conn, options) do
with %{auth_credentials: %{username: username}} <- conn.assigns,
{:ok, %User{} = user} <- user_fetcher(username) do
conn
|> assign(:auth_user, user)
else
_ -> conn
end
end
defp user_fetcher(username_or_email) do
{
:ok,
cond do
# First, try logging in as if it was a name
user = Repo.get_by(User, %{nickname: username_or_email}) ->
user
# If we get nil, we try using it as an email
user = Repo.get_by(User, %{email: username_or_email}) ->
user
end
}
end
end

View file

@ -0,0 +1,19 @@
defmodule Pleroma.Plugs.UserIsAdminPlug do
import Plug.Conn
alias Pleroma.User
def init(options) do
options
end
def call(%{assigns: %{user: %User{info: %{"is_admin" => true}}}} = conn, _) do
conn
end
def call(conn, _) do
conn
|> put_resp_content_type("application/json")
|> send_resp(403, Jason.encode!(%{error: "User is not admin."}))
|> halt
end
end

View file

@ -0,0 +1,343 @@
defmodule Pleroma.ReverseProxy do
@keep_req_headers ~w(accept user-agent accept-encoding cache-control if-modified-since if-unmodified-since if-none-match if-range range)
@resp_cache_headers ~w(etag date last-modified cache-control)
@keep_resp_headers @resp_cache_headers ++
~w(content-type content-disposition content-encoding content-range accept-ranges vary)
@default_cache_control_header "public, max-age=1209600"
@valid_resp_codes [200, 206, 304]
@max_read_duration :timer.seconds(30)
@max_body_length :infinity
@methods ~w(GET HEAD)
@moduledoc """
A reverse proxy.
Pleroma.ReverseProxy.call(conn, url, options)
It is not meant to be added into a plug pipeline, but to be called from another plug or controller.
Supports `#{inspect(@methods)}` HTTP methods, and only allows `#{inspect(@valid_resp_codes)}` status codes.
Responses are chunked to the client while downloading from the upstream.
Some request / responses headers are preserved:
* request: `#{inspect(@keep_req_headers)}`
* response: `#{inspect(@keep_resp_headers)}`
If no caching headers (`#{inspect(@resp_cache_headers)}`) are returned by upstream, `cache-control` will be
set to `#{inspect(@default_cache_control_header)}`.
Options:
* `redirect_on_failure` (default `false`). Redirects the client to the real remote URL if there's any HTTP
errors. Any error during body processing will not be redirected as the response is chunked. This may expose
the remote URL, clients' IPs, etc.
* `max_body_length` (default `#{inspect(@max_body_length)}`): limits the content length to be approximately the
specified length. It is validated with the `content-length` header and also verified when proxying.
* `max_read_duration` (default `#{inspect(@max_read_duration)}` ms): the total time the connection is allowed to
read from the remote upstream.
* `inline_content_types`:
* `true` will not alter `content-disposition` (up to the upstream),
* `false` will add `content-disposition: attachment` to any request,
* a list of whitelisted content types
* `keep_user_agent` will forward the client's user-agent to the upstream. This may be useful if the upstream is
doing content transformation (encoding, etc.) depending on the request.
* `req_headers`, `resp_headers` additional headers.
* `http`: options for [hackney](https://github.com/benoitc/hackney).
"""
@hackney Application.get_env(:pleroma, :hackney, :hackney)
@httpoison Application.get_env(:pleroma, :httpoison, HTTPoison)
@default_hackney_options [{:follow_redirect, true}]
@inline_content_types [
"image/gif",
"image/jpeg",
"image/jpg",
"image/png",
"image/svg+xml",
"audio/mpeg",
"audio/mp3",
"video/webm",
"video/mp4",
"video/quicktime"
]
require Logger
import Plug.Conn
@type option() ::
{:keep_user_agent, boolean}
| {:max_read_duration, :timer.time() | :infinity}
| {:max_body_length, non_neg_integer() | :infinity}
| {:http, []}
| {:req_headers, [{String.t(), String.t()}]}
| {:resp_headers, [{String.t(), String.t()}]}
| {:inline_content_types, boolean() | [String.t()]}
| {:redirect_on_failure, boolean()}
@spec call(Plug.Conn.t(), url :: String.t(), [option()]) :: Plug.Conn.t()
def call(conn = %{method: method}, url, opts \\ []) when method in @methods do
hackney_opts =
@default_hackney_options
|> Keyword.merge(Keyword.get(opts, :http, []))
|> @httpoison.process_request_options()
req_headers = build_req_headers(conn.req_headers, opts)
opts =
if filename = Pleroma.Web.MediaProxy.filename(url) do
Keyword.put_new(opts, :attachment_name, filename)
else
opts
end
with {:ok, code, headers, client} <- request(method, url, req_headers, hackney_opts),
:ok <- header_length_constraint(headers, Keyword.get(opts, :max_body_length)) do
response(conn, client, url, code, headers, opts)
else
{:ok, code, headers} ->
head_response(conn, url, code, headers, opts)
|> halt()
{:error, {:invalid_http_response, code}} ->
Logger.error("#{__MODULE__}: request to #{inspect(url)} failed with HTTP status #{code}")
conn
|> error_or_redirect(
url,
code,
"Request failed: " <> Plug.Conn.Status.reason_phrase(code),
opts
)
|> halt()
{:error, error} ->
Logger.error("#{__MODULE__}: request to #{inspect(url)} failed: #{inspect(error)}")
conn
|> error_or_redirect(url, 500, "Request failed", opts)
|> halt()
end
end
def call(conn, _, _) do
conn
|> send_resp(400, Plug.Conn.Status.reason_phrase(400))
|> halt()
end
defp request(method, url, headers, hackney_opts) do
Logger.debug("#{__MODULE__} #{method} #{url} #{inspect(headers)}")
method = method |> String.downcase() |> String.to_existing_atom()
case @hackney.request(method, url, headers, "", hackney_opts) do
{:ok, code, headers, client} when code in @valid_resp_codes ->
{:ok, code, downcase_headers(headers), client}
{:ok, code, headers} when code in @valid_resp_codes ->
{:ok, code, downcase_headers(headers)}
{:ok, code, _, _} ->
{:error, {:invalid_http_response, code}}
{:error, error} ->
{:error, error}
end
end
defp response(conn, client, url, status, headers, opts) do
result =
conn
|> put_resp_headers(build_resp_headers(headers, opts))
|> send_chunked(status)
|> chunk_reply(client, opts)
case result do
{:ok, conn} ->
halt(conn)
{:error, :closed, conn} ->
:hackney.close(client)
halt(conn)
{:error, error, conn} ->
Logger.warn(
"#{__MODULE__} request to #{url} failed while reading/chunking: #{inspect(error)}"
)
:hackney.close(client)
halt(conn)
end
end
defp chunk_reply(conn, client, opts) do
chunk_reply(conn, client, opts, 0, 0)
end
defp chunk_reply(conn, client, opts, sent_so_far, duration) do
with {:ok, duration} <-
check_read_duration(
duration,
Keyword.get(opts, :max_read_duration, @max_read_duration)
),
{:ok, data} <- @hackney.stream_body(client),
{:ok, duration} <- increase_read_duration(duration),
sent_so_far = sent_so_far + byte_size(data),
:ok <- body_size_constraint(sent_so_far, Keyword.get(opts, :max_body_length)),
{:ok, conn} <- chunk(conn, data) do
chunk_reply(conn, client, opts, sent_so_far, duration)
else
:done -> {:ok, conn}
{:error, error} -> {:error, error, conn}
end
end
defp head_response(conn, _url, code, headers, opts) do
conn
|> put_resp_headers(build_resp_headers(headers, opts))
|> send_resp(code, "")
end
defp error_or_redirect(conn, url, code, body, opts) do
if Keyword.get(opts, :redirect_on_failure, false) do
conn
|> Phoenix.Controller.redirect(external: url)
|> halt()
else
conn
|> send_resp(code, body)
|> halt
end
end
defp downcase_headers(headers) do
Enum.map(headers, fn {k, v} ->
{String.downcase(k), v}
end)
end
defp get_content_type(headers) do
{_, content_type} =
List.keyfind(headers, "content-type", 0, {"content-type", "application/octet-stream"})
[content_type | _] = String.split(content_type, ";")
content_type
end
defp put_resp_headers(conn, headers) do
Enum.reduce(headers, conn, fn {k, v}, conn ->
put_resp_header(conn, k, v)
end)
end
defp build_req_headers(headers, opts) do
headers =
headers
|> downcase_headers()
|> Enum.filter(fn {k, _} -> k in @keep_req_headers end)
|> (fn headers ->
headers = headers ++ Keyword.get(opts, :req_headers, [])
if Keyword.get(opts, :keep_user_agent, false) do
List.keystore(
headers,
"user-agent",
0,
{"user-agent", Pleroma.Application.user_agent()}
)
else
headers
end
end).()
end
defp build_resp_headers(headers, opts) do
headers
|> Enum.filter(fn {k, _} -> k in @keep_resp_headers end)
|> build_resp_cache_headers(opts)
|> build_resp_content_disposition_header(opts)
|> (fn headers -> headers ++ Keyword.get(opts, :resp_headers, []) end).()
end
defp build_resp_cache_headers(headers, opts) do
has_cache? = Enum.any?(headers, fn {k, _} -> k in @resp_cache_headers end)
if has_cache? do
headers
else
List.keystore(headers, "cache-control", 0, {"cache-control", @default_cache_control_header})
end
end
defp build_resp_content_disposition_header(headers, opts) do
opt = Keyword.get(opts, :inline_content_types, @inline_content_types)
content_type = get_content_type(headers)
attachment? =
cond do
is_list(opt) && !Enum.member?(opt, content_type) -> true
opt == false -> true
true -> false
end
if attachment? do
disposition = "attachment; filename=" <> Keyword.get(opts, :attachment_name, "attachment")
List.keystore(headers, "content-disposition", 0, {"content-disposition", disposition})
else
headers
end
end
defp header_length_constraint(headers, limit) when is_integer(limit) and limit > 0 do
with {_, size} <- List.keyfind(headers, "content-length", 0),
{size, _} <- Integer.parse(size),
true <- size <= limit do
:ok
else
false ->
{:error, :body_too_large}
_ ->
:ok
end
end
defp header_length_constraint(_, _), do: :ok
defp body_size_constraint(size, limit) when is_integer(limit) and limit > 0 and size >= limit do
{:error, :body_too_large}
end
defp body_size_constraint(_, _), do: :ok
defp check_read_duration(duration, max)
when is_integer(duration) and is_integer(max) and max > 0 do
if duration > max do
{:error, :read_duration_exceeded}
else
{:ok, {duration, :erlang.system_time(:millisecond)}}
end
end
defp check_read_duration(_, _), do: {:ok, :no_duration_limit, :no_duration_limit}
defp increase_read_duration({previous_duration, started})
when is_integer(previous_duration) and is_integer(started) do
duration = :erlang.system_time(:millisecond) - started
{:ok, previous_duration + duration}
end
defp increase_read_duration(_) do
{:ok, :no_duration_limit, :no_duration_limit}
end
end

View file

@ -1,206 +1,222 @@
defmodule Pleroma.Upload do
@moduledoc """
# Upload
Options:
* `:type`: presets for activity type (defaults to Document) and size limits from app configuration
* `:description`: upload alternative text
* `:base_url`: override base url
* `:uploader`: override uploader
* `:filters`: override filters
* `:size_limit`: override size limit
* `:activity_type`: override activity type
The `%Pleroma.Upload{}` struct: all documented fields are meant to be overwritten in filters:
* `:id` - the upload id.
* `:name` - the upload file name.
* `:path` - the upload path: initially set to `id/name`, but it can be changed. Keep in mind that once
created, the path is permanent; changing it (especially in uploaders) is probably a bad idea!
* `:tempfile` - path to the temporary file. Prefer in-place changes on the file rather than changing the
path as the temporary file is also tracked by `Plug.Upload{}` and automatically deleted once the request is over.
Related behaviors:
* `Pleroma.Uploaders.Uploader`
* `Pleroma.Upload.Filter`
"""
alias Ecto.UUID
alias Pleroma.Web
require Logger
def store(%Plug.Upload{} = file, should_dedupe) do
content_type = get_content_type(file.path)
uuid = get_uuid(file, should_dedupe)
name = get_name(file, uuid, content_type, should_dedupe)
upload_folder = get_upload_path(uuid, should_dedupe)
url_path = get_url(name, uuid, should_dedupe)
@type source ::
Plug.Upload.t() | data_uri_string ::
String.t() | {:from_local, name :: String.t(), id :: String.t(), path :: String.t()}
File.mkdir_p!(upload_folder)
result_file = Path.join(upload_folder, name)
@type option ::
{:type, :avatar | :banner | :background}
| {:description, String.t()}
| {:activity_type, String.t()}
| {:size_limit, nil | non_neg_integer()}
| {:uploader, module()}
| {:filters, [module()]}
if File.exists?(result_file) do
File.rm!(file.path)
else
File.cp!(file.path, result_file)
end
strip_exif_data(content_type, result_file)
%{
"type" => "Document",
"url" => [
%{
"type" => "Link",
"mediaType" => content_type,
"href" => url_path
@type t :: %__MODULE__{
id: String.t(),
name: String.t(),
tempfile: String.t(),
content_type: String.t(),
path: String.t()
}
],
"name" => name
}
defstruct [:id, :name, :tempfile, :content_type, :path]
@spec store(source, options :: [option()]) :: {:ok, Map.t()} | {:error, any()}
def store(upload, opts \\ []) do
opts = get_opts(opts)
with {:ok, upload} <- prepare_upload(upload, opts),
upload = %__MODULE__{upload | path: upload.path || "#{upload.id}/#{upload.name}"},
{:ok, upload} <- Pleroma.Upload.Filter.filter(opts.filters, upload),
{:ok, url_spec} <- Pleroma.Uploaders.Uploader.put_file(opts.uploader, upload) do
{:ok,
%{
"type" => opts.activity_type,
"url" => [
%{
"type" => "Link",
"mediaType" => upload.content_type,
"href" => url_from_spec(opts.base_url, url_spec)
}
],
"name" => Map.get(opts, :description) || upload.name
}}
else
{:error, error} ->
Logger.error(
"#{__MODULE__} store (using #{inspect(opts.uploader)}) failed: #{inspect(error)}"
)
{:error, error}
end
end
def store(%{"img" => "data:image/" <> image_data}, should_dedupe) do
defp get_opts(opts) do
{size_limit, activity_type} =
case Keyword.get(opts, :type) do
:banner ->
{Pleroma.Config.get!([:instance, :banner_upload_limit]), "Image"}
:avatar ->
{Pleroma.Config.get!([:instance, :avatar_upload_limit]), "Image"}
:background ->
{Pleroma.Config.get!([:instance, :background_upload_limit]), "Image"}
_ ->
{Pleroma.Config.get!([:instance, :upload_limit]), "Document"}
end
opts = %{
activity_type: Keyword.get(opts, :activity_type, activity_type),
size_limit: Keyword.get(opts, :size_limit, size_limit),
uploader: Keyword.get(opts, :uploader, Pleroma.Config.get([__MODULE__, :uploader])),
filters: Keyword.get(opts, :filters, Pleroma.Config.get([__MODULE__, :filters])),
description: Keyword.get(opts, :description),
base_url:
Keyword.get(
opts,
:base_url,
Pleroma.Config.get([__MODULE__, :base_url], Pleroma.Web.base_url())
)
}
# TODO: 1.0+ : remove old config compatibility
opts =
if Pleroma.Config.get([__MODULE__, :strip_exif]) == true &&
!Enum.member?(opts.filters, Pleroma.Upload.Filter.Mogrify) do
Logger.warn("""
Pleroma: configuration `:instance, :strip_exif` is deprecated, please instead set:
:pleroma, Pleroma.Upload, [filters: [Pleroma.Upload.Filter.Mogrify]]
:pleroma, Pleroma.Upload.Filter.Mogrify, args: "strip"
""")
Pleroma.Config.put([Pleroma.Upload.Filter.Mogrify], args: "strip")
Map.put(opts, :filters, opts.filters ++ [Pleroma.Upload.Filter.Mogrify])
else
opts
end
opts =
if Pleroma.Config.get([:instance, :dedupe_media]) == true &&
!Enum.member?(opts.filters, Pleroma.Upload.Filter.Dedupe) do
Logger.warn("""
Pleroma: configuration `:instance, :dedupe_media` is deprecated, please instead set:
:pleroma, Pleroma.Upload, [filters: [Pleroma.Upload.Filter.Dedupe]]
""")
Map.put(opts, :filters, opts.filters ++ [Pleroma.Upload.Filter.Dedupe])
else
opts
end
end
defp prepare_upload(%Plug.Upload{} = file, opts) do
with :ok <- check_file_size(file.path, opts.size_limit),
{:ok, content_type, name} <- Pleroma.MIME.file_mime_type(file.path, file.filename) do
{:ok,
%__MODULE__{
id: UUID.generate(),
name: name,
tempfile: file.path,
content_type: content_type
}}
end
end
defp prepare_upload(%{"img" => "data:image/" <> image_data}, opts) do
parsed = Regex.named_captures(~r/(?<filetype>jpeg|png|gif);base64,(?<data>.*)/, image_data)
data = Base.decode64!(parsed["data"], ignore: :whitespace)
uuid = UUID.generate()
uuidpath = Path.join(upload_path(), uuid)
uuid = UUID.generate()
hash = String.downcase(Base.encode16(:crypto.hash(:sha256, data)))
File.mkdir_p!(upload_path())
with :ok <- check_binary_size(data, opts.size_limit),
tmp_path <- tempfile_for_image(data),
{:ok, content_type, name} <-
Pleroma.MIME.bin_mime_type(data, hash <> "." <> parsed["filetype"]) do
{:ok,
%__MODULE__{
id: UUID.generate(),
name: name,
tempfile: tmp_path,
content_type: content_type
}}
end
end
File.write!(uuidpath, data)
# For Mix.Tasks.MigrateLocalUploads
defp prepare_upload(upload = %__MODULE__{tempfile: path}, _opts) do
with {:ok, content_type} <- Pleroma.MIME.file_mime_type(path) do
{:ok, %__MODULE__{upload | content_type: content_type}}
end
end
content_type = get_content_type(uuidpath)
defp check_binary_size(binary, size_limit)
when is_integer(size_limit) and size_limit > 0 and byte_size(binary) >= size_limit do
{:error, :file_too_large}
end
name =
create_name(
String.downcase(Base.encode16(:crypto.hash(:sha256, data))),
parsed["filetype"],
content_type
)
defp check_binary_size(_, _), do: :ok
upload_folder = get_upload_path(uuid, should_dedupe)
url_path = get_url(name, uuid, should_dedupe)
File.mkdir_p!(upload_folder)
result_file = Path.join(upload_folder, name)
if should_dedupe do
if !File.exists?(result_file) do
File.rename(uuidpath, result_file)
else
File.rm!(uuidpath)
end
defp check_file_size(path, size_limit) when is_integer(size_limit) and size_limit > 0 do
with {:ok, %{size: size}} <- File.stat(path),
true <- size <= size_limit do
:ok
else
File.rename(uuidpath, result_file)
end
strip_exif_data(content_type, result_file)
%{
"type" => "Image",
"url" => [
%{
"type" => "Link",
"mediaType" => content_type,
"href" => url_path
}
],
"name" => name
}
end
def strip_exif_data(content_type, file) do
settings = Application.get_env(:pleroma, Pleroma.Upload)
do_strip = Keyword.fetch!(settings, :strip_exif)
[filetype, ext] = String.split(content_type, "/")
if filetype == "image" and do_strip == true do
Mogrify.open(file) |> Mogrify.custom("strip") |> Mogrify.save(in_place: true)
false -> {:error, :file_too_large}
error -> error
end
end
def upload_path do
settings = Application.get_env(:pleroma, Pleroma.Upload)
Keyword.fetch!(settings, :uploads)
defp check_file_size(_, _), do: :ok
# Creates a tempfile using the Plug.Upload GenServer, which cleans it up
# automatically.
defp tempfile_for_image(data) do
{:ok, tmp_path} = Plug.Upload.random_file("profile_pics")
{:ok, tmp_file} = File.open(tmp_path, [:write, :raw, :binary])
IO.binwrite(tmp_file, data)
tmp_path
end
defp create_name(uuid, ext, type) do
case type do
"application/octet-stream" ->
String.downcase(Enum.join([uuid, ext], "."))
"audio/mpeg" ->
String.downcase(Enum.join([uuid, "mp3"], "."))
_ ->
String.downcase(Enum.join([uuid, List.last(String.split(type, "/"))], "."))
end
defp url_from_spec(base_url, {:file, path}) do
[base_url, "media", path]
|> Path.join()
end
defp get_uuid(file, should_dedupe) do
if should_dedupe do
Base.encode16(:crypto.hash(:sha256, File.read!(file.path)))
else
UUID.generate()
end
end
defp get_name(file, uuid, type, should_dedupe) do
if should_dedupe do
create_name(uuid, List.last(String.split(file.filename, ".")), type)
else
parts = String.split(file.filename, ".")
new_filename =
if length(parts) > 1 do
Enum.drop(parts, -1) |> Enum.join(".")
else
Enum.join(parts)
end
case type do
"application/octet-stream" -> file.filename
"audio/mpeg" -> new_filename <> ".mp3"
"image/jpeg" -> new_filename <> ".jpg"
_ -> Enum.join([new_filename, String.split(type, "/") |> List.last()], ".")
end
end
end
defp get_upload_path(uuid, should_dedupe) do
if should_dedupe do
upload_path()
else
Path.join(upload_path(), uuid)
end
end
defp get_url(name, uuid, should_dedupe) do
if should_dedupe do
url_for(:cow_uri.urlencode(name))
else
url_for(Path.join(uuid, :cow_uri.urlencode(name)))
end
end
defp url_for(file) do
"#{Web.base_url()}/media/#{file}"
end
def get_content_type(file) do
match =
File.open(file, [:read], fn f ->
case IO.binread(f, 8) do
<<0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A>> ->
"image/png"
<<0x47, 0x49, 0x46, 0x38, _, 0x61, _, _>> ->
"image/gif"
<<0xFF, 0xD8, 0xFF, _, _, _, _, _>> ->
"image/jpeg"
<<0x1A, 0x45, 0xDF, 0xA3, _, _, _, _>> ->
"video/webm"
<<0x00, 0x00, 0x00, _, 0x66, 0x74, 0x79, 0x70>> ->
"video/mp4"
<<0x49, 0x44, 0x33, _, _, _, _, _>> ->
"audio/mpeg"
<<255, 251, _, 68, 0, 0, 0, 0>> ->
"audio/mpeg"
<<0x4F, 0x67, 0x67, 0x53, 0x00, 0x02, 0x00, 0x00>> ->
"audio/ogg"
<<0x52, 0x49, 0x46, 0x46, _, _, _, _>> ->
"audio/wav"
_ ->
"application/octet-stream"
end
end)
case match do
{:ok, type} -> type
_e -> "application/octet-stream"
end
defp url_from_spec({:url, url}) do
url
end
end

View file

@ -0,0 +1,35 @@
defmodule Pleroma.Upload.Filter do
@moduledoc """
Upload Filter behaviour
This behaviour allows filtering actions to run just before a file is uploaded. A filter can:
* modify the temporary file in place
* change any field of a `Pleroma.Upload` struct
* cancel/stop the upload
"""
require Logger
@callback filter(Pleroma.Upload.t()) :: :ok | {:ok, Pleroma.Upload.t()} | {:error, any()}
@spec filter([module()], Pleroma.Upload.t()) :: {:ok, Pleroma.Upload.t()} | {:error, any()}
def filter([], upload) do
{:ok, upload}
end
def filter([filter | rest], upload) do
case filter.filter(upload) do
:ok ->
filter(rest, upload)
{:ok, upload} ->
filter(rest, upload)
error ->
Logger.error("#{__MODULE__}: Filter #{filter} failed: #{inspect(error)}")
error
end
end
end

View file

@ -0,0 +1,10 @@
defmodule Pleroma.Upload.Filter.AnonymizeFilename do
@moduledoc "Replaces the original filename with a randomly generated string."
@behaviour Pleroma.Upload.Filter
def filter(upload) do
extension = List.last(String.split(upload.name, "."))
string = Base.url_encode64(:crypto.strong_rand_bytes(10), padding: false)
{:ok, %Pleroma.Upload{upload | name: string <> "." <> extension}}
end
end

View file

@ -0,0 +1,10 @@
defmodule Pleroma.Upload.Filter.Dedupe do
@behaviour Pleroma.Upload.Filter
def filter(upload = %Pleroma.Upload{name: name, tempfile: path}) do
extension = String.split(name, ".") |> List.last()
shasum = :crypto.hash(:sha256, File.read!(upload.tempfile)) |> Base.encode16(case: :lower)
filename = shasum <> "." <> extension
{:ok, %Pleroma.Upload{upload | id: shasum, path: filename}}
end
end

View file

@ -0,0 +1,60 @@
defmodule Pleroma.Upload.Filter.Mogrifun do
@behaviour Pleroma.Upload.Filter
@filters [
{"implode", "1"},
{"-raise", "20"},
{"+raise", "20"},
[{"-interpolate", "nearest"}, {"-virtual-pixel", "mirror"}, {"-spread", "5"}],
"+polaroid",
{"-statistic", "Mode 10"},
{"-emboss", "0x1.1"},
{"-emboss", "0x2"},
{"-colorspace", "Gray"},
"-negate",
[{"-channel", "green"}, "-negate"],
[{"-channel", "red"}, "-negate"],
[{"-channel", "blue"}, "-negate"],
{"+level-colors", "green,gold"},
{"+level-colors", ",DodgerBlue"},
{"+level-colors", ",Gold"},
{"+level-colors", ",Lime"},
{"+level-colors", ",Red"},
{"+level-colors", ",DarkGreen"},
{"+level-colors", "firebrick,yellow"},
{"+level-colors", "'rgb(102,75,25)',lemonchiffon"},
[{"fill", "red"}, {"tint", "40"}],
[{"fill", "green"}, {"tint", "40"}],
[{"fill", "blue"}, {"tint", "40"}],
[{"fill", "yellow"}, {"tint", "40"}]
]
def filter(%Pleroma.Upload{tempfile: file, content_type: "image" <> _}) do
filter = Enum.random(@filters)
file
|> Mogrify.open()
|> mogrify_filter(filter)
|> Mogrify.save(in_place: true)
:ok
end
def filter(_), do: :ok
defp mogrify_filter(mogrify, [filter | rest]) do
mogrify
|> mogrify_filter(filter)
|> mogrify_filter(rest)
end
defp mogrify_filter(mogrify, []), do: mogrify
defp mogrify_filter(mogrify, {action, options}) do
Mogrify.custom(mogrify, action, options)
end
defp mogrify_filter(mogrify, string) when is_binary(string) do
Mogrify.custom(mogrify, string)
end
end

View file

@ -0,0 +1,37 @@
defmodule Pleroma.Upload.Filter.Mogrify do
@behaviour Pleroma.Upload.Filter
@type conversion :: action :: String.t() | {action :: String.t(), opts :: String.t()}
@type conversions :: conversion() | [conversion()]
def filter(%Pleroma.Upload{tempfile: file, content_type: "image" <> _}) do
filters = Pleroma.Config.get!([__MODULE__, :args])
file
|> Mogrify.open()
|> mogrify_filter(filters)
|> Mogrify.save(in_place: true)
:ok
end
def filter(_), do: :ok
defp mogrify_filter(mogrify, nil), do: mogrify
defp mogrify_filter(mogrify, [filter | rest]) do
mogrify
|> mogrify_filter(filter)
|> mogrify_filter(rest)
end
defp mogrify_filter(mogrify, []), do: mogrify
defp mogrify_filter(mogrify, {action, options}) do
Mogrify.custom(mogrify, action, options)
end
defp mogrify_filter(mogrify, action) when is_binary(action) do
Mogrify.custom(mogrify, action)
end
end

View file

@ -0,0 +1,34 @@
defmodule Pleroma.Uploaders.Local do
@behaviour Pleroma.Uploaders.Uploader
alias Pleroma.Web
def get_file(_) do
{:ok, {:static_dir, upload_path()}}
end
def put_file(upload) do
{local_path, file} =
case Enum.reverse(String.split(upload.path, "/", trim: true)) do
[file] ->
{upload_path(), file}
[file | folders] ->
path = Path.join([upload_path()] ++ Enum.reverse(folders))
File.mkdir_p!(path)
{path, file}
end
result_file = Path.join(local_path, file)
unless File.exists?(result_file) do
File.cp!(upload.tempfile, result_file)
end
:ok
end
def upload_path do
Pleroma.Config.get!([__MODULE__, :uploads])
end
end

View file

@ -0,0 +1,31 @@
defmodule Pleroma.Uploaders.MDII do
alias Pleroma.Config
@behaviour Pleroma.Uploaders.Uploader
@httpoison Application.get_env(:pleroma, :httpoison)
# MDII-hosted images are never passed through the MediaPlug; only local media.
# Delegate to Pleroma.Uploaders.Local
def get_file(file) do
Pleroma.Uploaders.Local.get_file(file)
end
def put_file(upload) do
cgi = Pleroma.Config.get([Pleroma.Uploaders.MDII, :cgi])
files = Pleroma.Config.get([Pleroma.Uploaders.MDII, :files])
{:ok, file_data} = File.read(upload.tempfile)
extension = String.split(upload.name, ".") |> List.last()
query = "#{cgi}?#{extension}"
with {:ok, %{status_code: 200, body: body}} <- @httpoison.post(query, file_data) do
remote_file_name = String.split(body) |> List.first()
public_url = "#{files}/#{remote_file_name}.#{extension}"
{:ok, {:url, public_url}}
else
_ -> Pleroma.Uploaders.Local.put_file(upload)
end
end
end

View file

@ -0,0 +1,46 @@
defmodule Pleroma.Uploaders.S3 do
@behaviour Pleroma.Uploaders.Uploader
require Logger
# The file name is re-encoded to satisfy S3's constraints while remaining compatible with previously generated links that used less strict file names
def get_file(file) do
config = Pleroma.Config.get([__MODULE__])
{:ok,
{:url,
Path.join([
Keyword.fetch!(config, :public_endpoint),
Keyword.fetch!(config, :bucket),
strict_encode(URI.decode(file))
])}}
end
def put_file(upload = %Pleroma.Upload{}) do
config = Pleroma.Config.get([__MODULE__])
bucket = Keyword.get(config, :bucket)
{:ok, file_data} = File.read(upload.tempfile)
s3_name = strict_encode(upload.path)
op =
ExAws.S3.put_object(bucket, s3_name, file_data, [
{:acl, :public_read},
{:content_type, upload.content_type}
])
case ExAws.request(op) do
{:ok, _} ->
{:ok, {:file, s3_name}}
error ->
Logger.error("#{__MODULE__}: #{inspect(error)}")
{:error, "S3 Upload failed"}
end
end
@regex Regex.compile!("[^0-9a-zA-Z!.*/'()_-]")
def strict_encode(name) do
String.replace(name, @regex, "-")
end
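# For example (illustrative): strict_encode("some file (1).png") returns
# "some-file-(1).png", since every character outside the allowed set is replaced
# with "-".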
end

View file

@ -0,0 +1,47 @@
defmodule Pleroma.Uploaders.Swift.Keystone do
use HTTPoison.Base
def process_url(url) do
Enum.join(
[Pleroma.Config.get!([Pleroma.Uploaders.Swift, :auth_url]), url],
"/"
)
end
def process_response_body(body) do
body
|> Poison.decode!()
end
def get_token() do
settings = Pleroma.Config.get(Pleroma.Uploaders.Swift)
username = Keyword.fetch!(settings, :username)
password = Keyword.fetch!(settings, :password)
tenant_id = Keyword.fetch!(settings, :tenant_id)
case post(
"/tokens",
make_auth_body(username, password, tenant_id),
["Content-Type": "application/json"],
hackney: [:insecure]
) do
{:ok, %HTTPoison.Response{status_code: 200, body: body}} ->
body["access"]["token"]["id"]
{:ok, %HTTPoison.Response{status_code: _}} ->
""
end
end
def make_auth_body(username, password, tenant) do
Poison.encode!(%{
:auth => %{
:passwordCredentials => %{
:username => username,
:password => password
},
:tenantId => tenant
}
})
end
end

View file

@ -0,0 +1,26 @@
defmodule Pleroma.Uploaders.Swift.Client do
use HTTPoison.Base
def process_url(url) do
Enum.join(
[Pleroma.Config.get!([Pleroma.Uploaders.Swift, :storage_url]), url],
"/"
)
end
def upload_file(filename, body, content_type) do
object_url = Pleroma.Config.get!([Pleroma.Uploaders.Swift, :object_url])
token = Pleroma.Uploaders.Swift.Keystone.get_token()
case put("#{filename}", body, "X-Auth-Token": token, "Content-Type": content_type) do
{:ok, %HTTPoison.Response{status_code: 201}} ->
{:ok, {:file, filename}}
{:ok, %HTTPoison.Response{status_code: 401}} ->
{:error, "Unauthorized, Bad Token"}
{:error, _} ->
{:error, "Swift Upload Error"}
end
end
end

View file

@ -0,0 +1,15 @@
defmodule Pleroma.Uploaders.Swift do
@behaviour Pleroma.Uploaders.Uploader
def get_file(name) do
{:ok, {:url, Path.join([Pleroma.Config.get!([__MODULE__, :object_url]), name])}}
end
def put_file(upload) do
Pleroma.Uploaders.Swift.Client.upload_file(
upload.path,
File.read!(upload.tempfile),
upload.content_type
)
end
end

View file

@ -0,0 +1,40 @@
defmodule Pleroma.Uploaders.Uploader do
@moduledoc """
Defines the contract for putting an uploaded file to, and getting it from, any backend.
"""
@doc """
Instructs how to get the file from the backend.
Used by `Pleroma.Plugs.UploadedMedia`.
"""
@type get_method :: {:static_dir, directory :: String.t()} | {:url, url :: String.t()}
@callback get_file(file :: String.t()) :: {:ok, get_method()}
@doc """
Put a file to the backend.
Returns:
* `:ok` which assumes `{:ok, upload.path}`
* `{:ok, spec}` where spec is:
* `{:file, filename :: String.t}` to handle reads with `get_file/1` (recommended)
This allows requests to be correctly proxied or redirected to the backend, and lets backends be migrated without breaking any URLs.
* `{:url, url :: String.t}` to bypass `get_file/1` and use the `url` directly in the activity.
* `{:error, String.t}` error information if the file failed to be saved to the backend.
"""
@callback put_file(Pleroma.Upload.t()) ::
:ok | {:ok, {:file | :url, String.t()}} | {:error, String.t()}
@spec put_file(module(), Pleroma.Upload.t()) ::
{:ok, {:file | :url, String.t()}} | {:error, String.t()}
def put_file(uploader, upload) do
case uploader.put_file(upload) do
:ok -> {:ok, {:file, upload.path}}
other -> other
end
end
end

View file

@ -4,7 +4,7 @@ defmodule Pleroma.User do
import Ecto.{Changeset, Query}
alias Pleroma.{Repo, User, Object, Web, Activity, Notification}
alias Comeonin.Pbkdf2
alias Pleroma.Web.{OStatus, Websub}
alias Pleroma.Web.{OStatus, Websub, OAuth}
alias Pleroma.Web.ActivityPub.{Utils, ActivityPub}
schema "users" do
@ -22,6 +22,7 @@ defmodule Pleroma.User do
field(:info, :map, default: %{})
field(:follower_address, :string)
field(:search_distance, :float, virtual: true)
field(:last_refreshed_at, :naive_datetime)
has_many(:notifications, Notification)
timestamps()
@ -41,6 +42,10 @@ def banner_url(user) do
end
end
def profile_url(%User{info: %{"source_data" => %{"url" => url}}}), do: url
def profile_url(%User{ap_id: ap_id}), do: ap_id
def profile_url(_), do: nil
def ap_id(%User{nickname: nickname}) do
"#{Web.base_url()}/users/#{nickname}"
end
@ -68,7 +73,8 @@ def user_info(%User{} = user) do
following_count: length(user.following) - oneself,
note_count: user.info["note_count"] || 0,
follower_count: user.info["follower_count"] || 0,
locked: user.info["locked"] || false
locked: user.info["locked"] || false,
default_scope: user.info["default_scope"] || "public"
}
end
@ -77,7 +83,7 @@ def remote_user_creation(params) do
changes =
%User{}
|> cast(params, [:bio, :name, :ap_id, :nickname, :info, :avatar])
|> validate_required([:name, :ap_id, :nickname])
|> validate_required([:name, :ap_id])
|> unique_constraint(:nickname)
|> validate_format(:nickname, @email_regex)
|> validate_length(:bio, max: 5000)
@ -111,8 +117,12 @@ def update_changeset(struct, params \\ %{}) do
end
def upgrade_changeset(struct, params \\ %{}) do
params =
params
|> Map.put(:last_refreshed_at, NaiveDateTime.utc_now())
struct
|> cast(params, [:bio, :name, :info, :follower_address, :avatar])
|> cast(params, [:bio, :name, :info, :follower_address, :avatar, :last_refreshed_at])
|> unique_constraint(:nickname)
|> validate_format(:nickname, ~r/^[a-zA-Z\d]+$/)
|> validate_length(:bio, max: 5000)
@ -126,6 +136,9 @@ def password_update_changeset(struct, params) do
|> validate_required([:password, :password_confirmation])
|> validate_confirmation(:password)
OAuth.Token.delete_user_tokens(struct)
OAuth.Authorization.delete_user_authorizations(struct)
if changeset.valid? do
hashed = Pbkdf2.hashpwsalt(changeset.changes[:password])
@ -168,33 +181,26 @@ def register_changeset(struct, params \\ %{}) do
end
end
def maybe_direct_follow(%User{} = follower, %User{info: info} = followed) do
user_config = Application.get_env(:pleroma, :user)
deny_follow_blocked = Keyword.get(user_config, :deny_follow_blocked)
def needs_update?(%User{local: true}), do: false
user_info = user_info(followed)
def needs_update?(%User{local: false, last_refreshed_at: nil}), do: true
should_direct_follow =
cond do
# if the account is locked, don't pre-create the relationship
user_info[:locked] == true ->
false
def needs_update?(%User{local: false} = user) do
NaiveDateTime.diff(NaiveDateTime.utc_now(), user.last_refreshed_at) >= 86400
end
# if the users are blocking each other, we shouldn't even be here, but check for it anyway
deny_follow_blocked and
(User.blocks?(follower, followed) or User.blocks?(followed, follower)) ->
false
def needs_update?(_), do: true
# if OStatus, then there is no three-way handshake to follow
User.ap_enabled?(followed) != true ->
true
def maybe_direct_follow(%User{} = follower, %User{local: true, info: %{"locked" => true}}) do
{:ok, follower}
end
# if there are no other reasons not to, just pre-create the relationship
true ->
true
end
def maybe_direct_follow(%User{} = follower, %User{local: true} = followed) do
follow(follower, followed)
end
if should_direct_follow do
def maybe_direct_follow(%User{} = follower, %User{} = followed) do
if !User.ap_enabled?(followed) do
follow(follower, followed)
else
{:ok, follower}
@ -289,6 +295,7 @@ def update_and_set_cache(changeset) do
def invalidate_cache(user) do
Cachex.del(:user_cache, "ap_id:#{user.ap_id}")
Cachex.del(:user_cache, "nickname:#{user.nickname}")
Cachex.del(:user_cache, "user_info:#{user.id}")
end
def get_cached_by_ap_id(ap_id) do
@ -457,15 +464,25 @@ def update_follower_count(%User{} = user) do
update_and_set_cache(cs)
end
def get_notified_from_activity(%Activity{recipients: to}) do
query =
from(
u in User,
where: u.ap_id in ^to,
where: u.local == true
)
def get_users_from_set_query(ap_ids, false) do
from(
u in User,
where: u.ap_id in ^ap_ids
)
end
Repo.all(query)
def get_users_from_set_query(ap_ids, true) do
query = get_users_from_set_query(ap_ids, false)
from(
u in query,
where: u.local == true
)
end
def get_users_from_set(ap_ids, local_only \\ true) do
get_users_from_set_query(ap_ids, local_only)
|> Repo.all()
end
def get_recipients_from_activity(%Activity{recipients: to}) do
@ -481,7 +498,7 @@ def get_recipients_from_activity(%Activity{recipients: to}) do
Repo.all(query)
end
def search(query, resolve) do
def search(query, resolve \\ false) do
# strip the beginning @ off if there is a query
query = String.trim_leading(query, "@")
@ -500,7 +517,8 @@ def search(query, resolve) do
u.nickname,
u.name
)
}
},
where: not is_nil(u.nickname)
)
q =
@ -579,11 +597,23 @@ def unblock_domain(user, domain) do
end
def local_user_query() do
from(u in User, where: u.local == true)
from(
u in User,
where: u.local == true,
where: not is_nil(u.nickname)
)
end
def deactivate(%User{} = user) do
new_info = Map.put(user.info, "deactivated", true)
def moderator_user_query() do
from(
u in User,
where: u.local == true,
where: fragment("?->'is_moderator' @> 'true'", u.info)
)
end
def deactivate(%User{} = user, status \\ true) do
new_info = Map.put(user.info, "deactivated", status)
cs = User.info_changeset(user, %{info: new_info})
update_and_set_cache(cs)
end
@ -616,11 +646,19 @@ def delete(%User{} = user) do
end
end)
:ok
{:ok, user}
end
def html_filter_policy(%User{info: %{"no_rich_text" => true}}) do
Pleroma.HTML.Scrubber.TwitterText
end
def html_filter_policy(_), do: nil
def get_or_fetch_by_ap_id(ap_id) do
if user = get_by_ap_id(ap_id) do
user = get_by_ap_id(ap_id)
if !is_nil(user) and !User.needs_update?(user) do
user
else
ap_try = ActivityPub.make_user_from_ap_id(ap_id)
@ -638,6 +676,25 @@ def get_or_fetch_by_ap_id(ap_id) do
end
end
def get_or_create_instance_user do
relay_uri = "#{Pleroma.Web.Endpoint.url()}/relay"
if user = get_by_ap_id(relay_uri) do
user
else
changes =
%User{}
|> cast(%{}, [:ap_id, :nickname, :local])
|> put_change(:ap_id, relay_uri)
|> put_change(:nickname, nil)
|> put_change(:local, true)
|> put_change(:follower_address, relay_uri <> "/followers")
{:ok, user} = Repo.insert(changes)
user
end
end
# AP style
def public_key_from_info(%{
"source_data" => %{"publicKey" => %{"publicKeyPem" => public_key_pem}}
@ -676,6 +733,7 @@ def insert_or_update_user(data) do
Repo.insert(cs, on_conflict: :replace_all, conflict_target: :nickname)
end
def ap_enabled?(%User{local: true}), do: true
def ap_enabled?(%User{info: info}), do: info["ap_enabled"]
def ap_enabled?(_), do: false
@ -686,4 +744,28 @@ def get_or_fetch(uri_or_nickname) do
get_or_fetch_by_nickname(uri_or_nickname)
end
end
# wait a period of time and return newest version of the User structs
# this is because we have synchronous follow APIs and need to simulate them
# with an async handshake
def wait_and_refresh(_, %User{local: true} = a, %User{local: true} = b) do
with %User{} = a <- Repo.get(User, a.id),
%User{} = b <- Repo.get(User, b.id) do
{:ok, a, b}
else
_e ->
:error
end
end
def wait_and_refresh(timeout, %User{} = a, %User{} = b) do
with :ok <- :timer.sleep(timeout),
%User{} = a <- Repo.get(User, a.id),
%User{} = b <- Repo.get(User, b.id) do
{:ok, a, b}
else
_e ->
:error
end
end
end

View file

@ -10,16 +10,39 @@ defmodule Pleroma.Web.ActivityPub.ActivityPub do
@httpoison Application.get_env(:pleroma, :httpoison)
@instance Application.get_env(:pleroma, :instance)
# For Announce activities, we filter the recipients based on following status for any actors
# that match actual users. See issue #164 for more information about why this is necessary.
defp get_recipients(%{"type" => "Announce"} = data) do
to = data["to"] || []
cc = data["cc"] || []
recipients = to ++ cc
actor = User.get_cached_by_ap_id(data["actor"])
def get_recipients(data) do
(data["to"] || []) ++ (data["cc"] || [])
recipients
|> Enum.filter(fn recipient ->
case User.get_cached_by_ap_id(recipient) do
nil ->
true
user ->
User.following?(user, actor)
end
end)
{recipients, to, cc}
end
defp get_recipients(data) do
to = data["to"] || []
cc = data["cc"] || []
recipients = to ++ cc
{recipients, to, cc}
end
defp check_actor_is_active(actor) do
if not is_nil(actor) do
with user <- User.get_cached_by_ap_id(actor),
nil <- user.info["deactivated"] do
false <- !!user.info["deactivated"] do
:ok
else
_e -> :reject
@ -35,12 +58,14 @@ def insert(map, local \\ true) when is_map(map) do
:ok <- check_actor_is_active(map["actor"]),
{:ok, map} <- MRF.filter(map),
:ok <- insert_full_object(map) do
{recipients, _, _} = get_recipients(map)
{:ok, activity} =
Repo.insert(%Activity{
data: map,
local: local,
actor: map["actor"],
recipients: get_recipients(map)
recipients: recipients
})
Notification.create_notifications(activity)
@ -66,6 +91,11 @@ def stream_out(activity) do
Pleroma.Web.Streamer.stream("public:local", activity)
end
activity.data["object"]
|> Map.get("tag", [])
|> Enum.filter(fn tag -> is_bitstring(tag) end)
|> Enum.map(fn tag -> Pleroma.Web.Streamer.stream("hashtag:" <> tag, activity) end)
if activity.data["object"]["attachment"] != [] do
Pleroma.Web.Streamer.stream("public:media", activity)
@ -241,8 +271,7 @@ def delete(%Object{data: %{"id" => id, "actor" => actor}} = object, local \\ tru
"to" => [user.follower_address, "https://www.w3.org/ns/activitystreams#Public"]
}
with Repo.delete(object),
Repo.delete_all(Activity.all_non_create_by_object_ap_id_q(id)),
with {:ok, _} <- Object.delete(object),
{:ok, activity} <- insert(data, local),
:ok <- maybe_federate(activity),
{:ok, _actor} <- User.decrease_note_count(user) do
@ -381,6 +410,20 @@ defp restrict_tag(query, %{"tag" => tag}) do
defp restrict_tag(query, _), do: query
defp restrict_to_cc(query, recipients_to, recipients_cc) do
from(
activity in query,
where:
fragment(
"(?->'to' \\?| ?) or (?->'cc' \\?| ?)",
activity.data,
^recipients_to,
activity.data,
^recipients_cc
)
)
end
defp restrict_recipients(query, [], _user), do: query
defp restrict_recipients(query, recipients, nil) do
@ -522,9 +565,17 @@ def fetch_activities(recipients, opts \\ %{}) do
|> Enum.reverse()
end
def upload(file) do
data = Upload.store(file, Application.get_env(:pleroma, :instance)[:dedupe_media])
Repo.insert(%Object{data: data})
def fetch_activities_bounded(recipients_to, recipients_cc, opts \\ %{}) do
fetch_activities_query([], opts)
|> restrict_to_cc(recipients_to, recipients_cc)
|> Repo.all()
|> Enum.reverse()
end
def upload(file, opts \\ []) do
with {:ok, data} <- Upload.store(file, opts) do
Repo.insert(%Object{data: data})
end
end
def user_data_from_user_object(data) do
@ -554,19 +605,28 @@ def user_data_from_user_object(data) do
"locked" => locked
},
avatar: avatar,
nickname: "#{data["preferredUsername"]}@#{URI.parse(data["id"]).host}",
name: data["name"],
follower_address: data["followers"],
bio: data["summary"]
}
# nickname can be nil because of virtual actors
user_data =
if data["preferredUsername"] do
Map.put(
user_data,
:nickname,
"#{data["preferredUsername"]}@#{URI.parse(data["id"]).host}"
)
else
Map.put(user_data, :nickname, nil)
end
{:ok, user_data}
end
def fetch_and_prepare_user_from_ap_id(ap_id) do
with {:ok, %{status_code: 200, body: body}} <-
@httpoison.get(ap_id, [Accept: "application/activity+json"], follow_redirect: true),
{:ok, data} <- Jason.decode(body) do
with {:ok, data} <- fetch_and_contain_remote_object_from_id(ap_id) do
user_data_from_user_object(data)
else
e -> Logger.error("Could not decode user at fetch #{ap_id}, #{inspect(e)}")
@ -593,14 +653,12 @@ def make_user_from_nickname(nickname) do
end
end
@quarantined_instances Keyword.get(@instance, :quarantined_instances, [])
def should_federate?(inbox, public) do
if public do
true
else
inbox_info = URI.parse(inbox)
inbox_info.host not in @quarantined_instances
!Enum.member?(Pleroma.Config.get([:instance, :quarantined_instances], []), inbox_info.host)
end
end
@ -619,7 +677,7 @@ def publish(actor, activity) do
(Pleroma.Web.Salmon.remote_users(activity) ++ followers)
|> Enum.filter(fn user -> User.ap_enabled?(user) end)
|> Enum.map(fn %{info: %{"source_data" => data}} ->
(data["endpoints"] && data["endpoints"]["sharedInbox"]) || data["inbox"]
(is_map(data["endpoints"]) && Map.get(data["endpoints"], "sharedInbox")) || data["inbox"]
end)
|> Enum.uniq()
|> Enum.filter(fn inbox -> should_federate?(inbox, public) end)
@ -670,27 +728,22 @@ def fetch_object_from_id(id) do
else
Logger.info("Fetching #{id} via AP")
with true <- String.starts_with?(id, "http"),
{:ok, %{body: body, status_code: code}} when code in 200..299 <-
@httpoison.get(
id,
[Accept: "application/activity+json"],
follow_redirect: true,
timeout: 10000,
recv_timeout: 20000
),
{:ok, data} <- Jason.decode(body),
with {:ok, data} <- fetch_and_contain_remote_object_from_id(id),
nil <- Object.normalize(data),
params <- %{
"type" => "Create",
"to" => data["to"],
"cc" => data["cc"],
"actor" => data["attributedTo"],
"actor" => data["actor"] || data["attributedTo"],
"object" => data
},
:ok <- Transmogrifier.contain_origin(id, params),
{:ok, activity} <- Transmogrifier.handle_incoming(params) do
{:ok, Object.normalize(activity.data["object"])}
else
{:error, {:reject, nil}} ->
{:reject, nil}
object = %Object{} ->
{:ok, object}
@ -705,6 +758,27 @@ def fetch_object_from_id(id) do
end
end
def fetch_and_contain_remote_object_from_id(id) do
Logger.info("Fetching #{id} via AP")
with true <- String.starts_with?(id, "http"),
{:ok, %{body: body, status_code: code}} when code in 200..299 <-
@httpoison.get(
id,
[Accept: "application/activity+json"],
follow_redirect: true,
timeout: 10000,
recv_timeout: 20000
),
{:ok, data} <- Jason.decode(body),
:ok <- Transmogrifier.contain_origin_from_id(id, data) do
{:ok, data}
else
e ->
{:error, e}
end
end
def is_public?(activity) do
"https://www.w3.org/ns/activitystreams#Public" in (activity.data["to"] ++
(activity.data["cc"] || []))
@ -719,4 +793,38 @@ def visible_for_user?(activity, user) do
y = activity.data["to"] ++ (activity.data["cc"] || [])
visible_for_user?(activity, nil) || Enum.any?(x, &(&1 in y))
end
# guard
def entire_thread_visible_for_user?(nil, user), do: false
# child
def entire_thread_visible_for_user?(
%Activity{data: %{"object" => %{"inReplyTo" => parent_id}}} = tail,
user
)
when is_binary(parent_id) do
parent = Activity.get_in_reply_to_activity(tail)
visible_for_user?(tail, user) && entire_thread_visible_for_user?(parent, user)
end
# root
def entire_thread_visible_for_user?(tail, user), do: visible_for_user?(tail, user)
# filter out broken threads
def contain_broken_threads(%Activity{} = activity, %User{} = user) do
entire_thread_visible_for_user?(activity, user)
end
# do post-processing on a specific activity
def contain_activity(%Activity{} = activity, %User{} = user) do
contain_broken_threads(activity, user)
end
# do post-processing on a timeline
def contain_timeline(timeline, user) do
timeline
|> Enum.filter(fn activity ->
contain_activity(activity, user)
end)
end
end
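
entire_thread_visible_for_user?/2 above walks the reply chain one inReplyTo hop at a time and only keeps an activity when every ancestor is visible. A self-contained model of that recursion over plain maps (no %Activity{} structs or database lookups), using a made-up three-post thread:

defmodule ThreadVisibilitySketch do
  # thread is a map of id => %{public: boolean, in_reply_to: id | nil}
  def entire_thread_visible?(nil, _thread), do: false

  def entire_thread_visible?(id, thread) do
    case Map.get(thread, id) do
      nil -> false
      %{public: false} -> false
      %{in_reply_to: nil} -> true
      %{in_reply_to: parent} -> entire_thread_visible?(parent, thread)
    end
  end
end

# placeholder thread
thread = %{
  "root" => %{public: true, in_reply_to: nil},
  "reply" => %{public: false, in_reply_to: "root"},
  "leaf" => %{public: true, in_reply_to: "reply"}
}

ThreadVisibilitySketch.entire_thread_visible?("leaf", thread)
# => false (the middle of the chain is not visible, so the leaf is filtered out too)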

View file

@ -3,12 +3,28 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubController do
alias Pleroma.{User, Object}
alias Pleroma.Web.ActivityPub.{ObjectView, UserView}
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Relay
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.Federator
require Logger
action_fallback(:errors)
plug(Pleroma.Web.FederatingPlug when action in [:inbox, :relay])
plug(:relay_active? when action in [:relay])
def relay_active?(conn, _) do
if Keyword.get(Application.get_env(:pleroma, :instance), :allow_relay) do
conn
else
conn
|> put_status(404)
|> json(%{error: "not found"})
|> halt
end
end
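
relay_active?/2 halts with a 404 unless the :allow_relay flag is set on the :instance config, so enabling the relay endpoint is a one-line override in the secret config (a sketch; the flag name is taken from the Keyword.get/2 call above):

# dev.secret.exs / prod.secret.exs
config :pleroma, :instance,
  allow_relay: true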
def user(conn, %{"nickname" => nickname}) do
with %User{} = user <- User.get_cached_by_nickname(nickname),
{:ok, user} <- Pleroma.Web.WebFinger.ensure_keys_present(user) do
@ -86,25 +102,54 @@ def outbox(conn, %{"nickname" => nickname}) do
outbox(conn, %{"nickname" => nickname, "max_id" => nil})
end
# TODO: Ensure that this inbox is a recipient of the message
def inbox(%{assigns: %{valid_signature: true}} = conn, %{"nickname" => nickname} = params) do
with %User{} = user <- User.get_cached_by_nickname(nickname),
true <- Utils.recipient_in_message(user.ap_id, params),
params <- Utils.maybe_splice_recipient(user.ap_id, params) do
Federator.enqueue(:incoming_ap_doc, params)
json(conn, "ok")
end
end
def inbox(%{assigns: %{valid_signature: true}} = conn, params) do
Federator.enqueue(:incoming_ap_doc, params)
json(conn, "ok")
end
# only accept relayed Creates
def inbox(conn, %{"type" => "Create"} = params) do
Logger.info(
"Signature missing or not from author, relayed Create message, fetching object from source"
)
ActivityPub.fetch_object_from_id(params["object"]["id"])
json(conn, "ok")
end
def inbox(conn, params) do
headers = Enum.into(conn.req_headers, %{})
if !String.contains?(headers["signature"] || "", params["actor"]) do
Logger.info("Signature not from author, relayed message, fetching from source")
ActivityPub.fetch_object_from_id(params["object"]["id"])
else
Logger.info("Signature error - make sure you are forwarding the HTTP Host header!")
Logger.info("Could not validate #{params["actor"]}")
if String.contains?(headers["signature"], params["actor"]) do
Logger.info(
"Signature validation error for: #{params["actor"]}, make sure you are forwarding the HTTP Host header!"
)
Logger.info(inspect(conn.req_headers))
end
json(conn, "ok")
json(conn, "error")
end
def relay(conn, params) do
with %User{} = user <- Relay.get_actor(),
{:ok, user} <- Pleroma.Web.WebFinger.ensure_keys_present(user) do
conn
|> put_resp_header("content-type", "application/activity+json")
|> json(UserView.render("user.json", %{user: user}))
else
nil -> {:error, :not_found}
end
end
def errors(conn, {:error, :not_found}) do

View file

@ -0,0 +1,23 @@
defmodule Pleroma.Web.ActivityPub.MRF.NormalizeMarkup do
alias Pleroma.HTML
@behaviour Pleroma.Web.ActivityPub.MRF
def filter(%{"type" => activity_type} = object) when activity_type == "Create" do
scrub_policy = Pleroma.Config.get([:mrf_normalize_markup, :scrub_policy])
child = object["object"]
content =
child["content"]
|> HTML.filter_tags(scrub_policy)
child = Map.put(child, "content", content)
object = Map.put(object, "object", child)
{:ok, object}
end
def filter(object), do: {:ok, object}
end
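
NormalizeMarkup reads its scrubber from the :mrf_normalize_markup config at runtime. A possible configuration sketch, assuming the bundled Pleroma.HTML.Scrubber.Default module is used as the scrub policy (any module accepted by HTML.filter_tags/2 would do):

config :pleroma, :instance,
  rewrite_policy: [Pleroma.Web.ActivityPub.MRF.NormalizeMarkup]

config :pleroma, :mrf_normalize_markup,
  # scrubber module is an assumption, swap in whatever scrub policy you use
  scrub_policy: Pleroma.HTML.Scrubber.Default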

View file

@ -2,48 +2,45 @@ defmodule Pleroma.Web.ActivityPub.MRF.RejectNonPublic do
alias Pleroma.User
@behaviour Pleroma.Web.ActivityPub.MRF
@mrf_rejectnonpublic Application.get_env(:pleroma, :mrf_rejectnonpublic)
@allow_followersonly Keyword.get(@mrf_rejectnonpublic, :allow_followersonly)
@allow_direct Keyword.get(@mrf_rejectnonpublic, :allow_direct)
@impl true
def filter(object) do
if object["type"] == "Create" do
user = User.get_cached_by_ap_id(object["actor"])
public = "https://www.w3.org/ns/activitystreams#Public"
def filter(%{"type" => "Create"} = object) do
user = User.get_cached_by_ap_id(object["actor"])
public = "https://www.w3.org/ns/activitystreams#Public"
# Determine visibility
visibility =
cond do
public in object["to"] -> "public"
public in object["cc"] -> "unlisted"
user.follower_address in object["to"] -> "followers"
true -> "direct"
# Determine visibility
visibility =
cond do
public in object["to"] -> "public"
public in object["cc"] -> "unlisted"
user.follower_address in object["to"] -> "followers"
true -> "direct"
end
policy = Pleroma.Config.get(:mrf_rejectnonpublic)
case visibility do
"public" ->
{:ok, object}
"unlisted" ->
{:ok, object}
"followers" ->
with true <- Keyword.get(policy, :allow_followersonly) do
{:ok, object}
else
_e -> {:reject, nil}
end
case visibility do
"public" ->
"direct" ->
with true <- Keyword.get(policy, :allow_direct) do
{:ok, object}
"unlisted" ->
{:ok, object}
"followers" ->
with true <- @allow_followersonly do
{:ok, object}
else
_e -> {:reject, nil}
end
"direct" ->
with true <- @allow_direct do
{:ok, object}
else
_e -> {:reject, nil}
end
end
else
{:ok, object}
else
_e -> {:reject, nil}
end
end
end
@impl true
def filter(object), do: {:ok, object}
end
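
With the switch to Pleroma.Config.get/1, the followers-only and direct-message switches for this policy are read at runtime instead of being baked in at compile time. A configuration sketch using the two keys the code looks up (values shown are examples):

config :pleroma, :mrf_rejectnonpublic,
  allow_followersonly: false,
  allow_direct: false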

View file

@ -2,81 +2,92 @@ defmodule Pleroma.Web.ActivityPub.MRF.SimplePolicy do
alias Pleroma.User
@behaviour Pleroma.Web.ActivityPub.MRF
@mrf_policy Application.get_env(:pleroma, :mrf_simple)
defp check_accept(%{host: actor_host} = _actor_info, object) do
accepts = Pleroma.Config.get([:mrf_simple, :accept])
@accept Keyword.get(@mrf_policy, :accept)
defp check_accept(actor_info, object) do
if length(@accept) > 0 and not (actor_info.host in @accept) do
cond do
accepts == [] -> {:ok, object}
actor_host == Pleroma.Config.get([Pleroma.Web.Endpoint, :url, :host]) -> {:ok, object}
Enum.member?(accepts, actor_host) -> {:ok, object}
true -> {:reject, nil}
end
end
defp check_reject(%{host: actor_host} = _actor_info, object) do
if Enum.member?(Pleroma.Config.get([:mrf_simple, :reject]), actor_host) do
{:reject, nil}
else
{:ok, object}
end
end
@reject Keyword.get(@mrf_policy, :reject)
defp check_reject(actor_info, object) do
if actor_info.host in @reject do
{:reject, nil}
else
{:ok, object}
end
defp check_media_removal(
%{host: actor_host} = _actor_info,
%{"type" => "Create", "object" => %{"attachement" => child_attachment}} = object
)
when length(child_attachment) > 0 do
object =
if Enum.member?(Pleroma.Config.get([:mrf_simple, :media_removal]), actor_host) do
child_object = Map.delete(object["object"], "attachment")
Map.put(object, "object", child_object)
else
object
end
{:ok, object}
end
@media_removal Keyword.get(@mrf_policy, :media_removal)
defp check_media_removal(actor_info, object) do
if actor_info.host in @media_removal do
child_object = Map.delete(object["object"], "attachment")
object = Map.put(object, "object", child_object)
{:ok, object}
else
{:ok, object}
end
defp check_media_removal(_actor_info, object), do: {:ok, object}
defp check_media_nsfw(
%{host: actor_host} = _actor_info,
%{
"type" => "Create",
"object" => %{"attachment" => child_attachment} = child_object
} = object
)
when length(child_attachment) > 0 do
object =
if Enum.member?(Pleroma.Config.get([:mrf_simple, :media_nsfw]), actor_host) do
tags = (child_object["tag"] || []) ++ ["nsfw"]
child_object = Map.put(child_object, "tags", tags)
child_object = Map.put(child_object, "sensitive", true)
Map.put(object, "object", child_object)
else
object
end
{:ok, object}
end
@media_nsfw Keyword.get(@mrf_policy, :media_nsfw)
defp check_media_nsfw(actor_info, object) do
child_object = object["object"]
defp check_media_nsfw(_actor_info, object), do: {:ok, object}
if actor_info.host in @media_nsfw and child_object["attachment"] != nil and
length(child_object["attachment"]) > 0 do
tags = (child_object["tag"] || []) ++ ["nsfw"]
child_object = Map.put(child_object, "tags", tags)
child_object = Map.put(child_object, "sensitive", true)
object = Map.put(object, "object", child_object)
{:ok, object}
else
{:ok, object}
end
end
defp check_ftl_removal(%{host: actor_host} = _actor_info, object) do
object =
with true <-
Enum.member?(
Pleroma.Config.get([:mrf_simple, :federated_timeline_removal]),
actor_host
),
user <- User.get_cached_by_ap_id(object["actor"]),
true <- "https://www.w3.org/ns/activitystreams#Public" in object["to"],
true <- user.follower_address in object["cc"] do
to =
List.delete(object["to"], "https://www.w3.org/ns/activitystreams#Public") ++
[user.follower_address]
@ftl_removal Keyword.get(@mrf_policy, :federated_timeline_removal)
defp check_ftl_removal(actor_info, object) do
if actor_info.host in @ftl_removal do
user = User.get_by_ap_id(object["actor"])
cc =
List.delete(object["cc"], user.follower_address) ++
["https://www.w3.org/ns/activitystreams#Public"]
# flip to/cc relationship to make the post unlisted
object =
if "https://www.w3.org/ns/activitystreams#Public" in object["to"] and
user.follower_address in object["cc"] do
to =
List.delete(object["to"], "https://www.w3.org/ns/activitystreams#Public") ++
[user.follower_address]
object
|> Map.put("to", to)
|> Map.put("cc", cc)
else
_ -> object
end
cc =
List.delete(object["cc"], user.follower_address) ++
["https://www.w3.org/ns/activitystreams#Public"]
object
|> Map.put("to", to)
|> Map.put("cc", cc)
else
object
end
{:ok, object}
else
{:ok, object}
end
{:ok, object}
end
@impl true

View file

@ -0,0 +1,23 @@
defmodule Pleroma.Web.ActivityPub.MRF.UserAllowListPolicy do
alias Pleroma.Config
@behaviour Pleroma.Web.ActivityPub.MRF
defp filter_by_list(object, []), do: {:ok, object}
defp filter_by_list(%{"actor" => actor} = object, allow_list) do
if actor in allow_list do
{:ok, object}
else
{:reject, nil}
end
end
@impl true
def filter(object) do
actor_info = URI.parse(object["actor"])
allow_list = Config.get([:mrf_user_allowlist, String.to_atom(actor_info.host)], [])
filter_by_list(object, allow_list)
end
end
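
The allowlist is keyed by host (converted to an atom) and holds the ap_ids of actors that may federate from that host. A configuration sketch with placeholder host and actor values:

config :pleroma, :mrf_user_allowlist,
  # placeholder host and actor
  "example.com": ["https://example.com/users/alice"]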

View file

@ -0,0 +1,46 @@
defmodule Pleroma.Web.ActivityPub.Relay do
alias Pleroma.{User, Object, Activity}
alias Pleroma.Web.ActivityPub.ActivityPub
require Logger
def get_actor do
User.get_or_create_instance_user()
end
def follow(target_instance) do
with %User{} = local_user <- get_actor(),
%User{} = target_user <- User.get_or_fetch_by_ap_id(target_instance),
{:ok, activity} <- ActivityPub.follow(local_user, target_user) do
Logger.info("relay: followed instance: #{target_instance}; id=#{activity.data["id"]}")
{:ok, activity}
else
e ->
Logger.error("error: #{inspect(e)}")
{:error, e}
end
end
def unfollow(target_instance) do
with %User{} = local_user <- get_actor(),
%User{} = target_user <- User.get_or_fetch_by_ap_id(target_instance),
{:ok, activity} <- ActivityPub.unfollow(local_user, target_user) do
Logger.info("relay: unfollowed instance: #{target_instance}: id=#{activity.data["id"]}")
{:ok, activity}
else
e ->
Logger.error("error: #{inspect(e)}")
{:error, e}
end
end
def publish(%Activity{data: %{"type" => "Create"}} = activity) do
with %User{} = user <- get_actor(),
%Object{} = object <- Object.normalize(activity.data["object"]["id"]) do
ActivityPub.announce(user, object)
else
e -> Logger.error("error: #{inspect(e)}")
end
end
def publish(_), do: nil
end
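
Relay.follow/1 and Relay.unfollow/1 take the relay's actor URL and return the resulting activity. A usage sketch from an attached console (iex -S mix), with a placeholder relay URL:

# placeholder relay actor URL
{:ok, _activity} = Pleroma.Web.ActivityPub.Relay.follow("https://relay.example.org/actor")

# and later, to stop announcing through that relay:
{:ok, _activity} = Pleroma.Web.ActivityPub.Relay.unfollow("https://relay.example.org/actor")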

View file

@ -21,13 +21,46 @@ def get_actor(%{"actor" => actor}) when is_list(actor) do
if is_binary(Enum.at(actor, 0)) do
Enum.at(actor, 0)
else
Enum.find(actor, fn %{"type" => type} -> type == "Person" end)
Enum.find(actor, fn %{"type" => type} -> type in ["Person", "Service", "Application"] end)
|> Map.get("id")
end
end
def get_actor(%{"actor" => actor}) when is_map(actor) do
actor["id"]
def get_actor(%{"actor" => %{"id" => id}}) when is_bitstring(id) do
id
end
def get_actor(%{"actor" => nil, "attributedTo" => actor}) when not is_nil(actor) do
get_actor(%{"actor" => actor})
end
@doc """
Checks that an imported AP object's actor matches the domain it came from.
"""
def contain_origin(id, %{"actor" => nil}), do: :error
def contain_origin(id, %{"actor" => actor} = params) do
id_uri = URI.parse(id)
actor_uri = URI.parse(get_actor(params))
if id_uri.host == actor_uri.host do
:ok
else
:error
end
end
def contain_origin_from_id(id, %{"id" => nil}), do: :error
def contain_origin_from_id(id, %{"id" => other_id} = params) do
id_uri = URI.parse(id)
other_uri = URI.parse(other_id)
if id_uri.host == other_uri.host do
:ok
else
:error
end
end
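
Both containment checks reduce to comparing the host of the object id with the host of the actor (or of the object's own id). With placeholder URIs:

# placeholder URIs
URI.parse("https://a.example/objects/1").host ==
  URI.parse("https://a.example/users/alice").host
# => true, same origin, the check returns :ok

URI.parse("https://a.example/objects/1").host ==
  URI.parse("https://b.example/users/mallory").host
# => false, cross-origin, the check returns :error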
@doc """
@ -37,6 +70,7 @@ def fix_object(object) do
object
|> fix_actor
|> fix_attachments
|> fix_url
|> fix_context
|> fix_in_reply_to
|> fix_emoji
@ -82,9 +116,25 @@ def fix_likes(object) do
object
end
def fix_in_reply_to(%{"inReplyTo" => in_reply_to_id} = object)
when not is_nil(in_reply_to_id) do
case ActivityPub.fetch_object_from_id(in_reply_to_id) do
def fix_in_reply_to(%{"inReplyTo" => in_reply_to} = object)
when not is_nil(in_reply_to) do
in_reply_to_id =
cond do
is_bitstring(in_reply_to) ->
in_reply_to
is_map(in_reply_to) && is_bitstring(in_reply_to["id"]) ->
in_reply_to["id"]
is_list(in_reply_to) && is_bitstring(Enum.at(in_reply_to, 0)) ->
Enum.at(in_reply_to, 0)
# Maybe I should output an error too?
true ->
""
end
case fetch_obj_helper(in_reply_to_id) do
{:ok, replied_object} ->
with %Activity{} = activity <-
Activity.get_create_activity_by_object_ap_id(replied_object.data["id"]) do
@ -96,12 +146,12 @@ def fix_in_reply_to(%{"inReplyTo" => in_reply_to_id} = object)
|> Map.put("context", replied_object.data["context"] || object["conversation"])
else
e ->
Logger.error("Couldn't fetch #{object["inReplyTo"]} #{inspect(e)}")
Logger.error("Couldn't fetch \"#{inspect(in_reply_to_id)}\", error: #{inspect(e)}")
object
end
e ->
Logger.error("Couldn't fetch #{object["inReplyTo"]} #{inspect(e)}")
Logger.error("Couldn't fetch \"#{inspect(in_reply_to_id)}\", error: #{inspect(e)}")
object
end
end
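
The cond above accepts the three shapes of inReplyTo seen in the wild: a bare id string, an embedded object carrying an id, or a list whose first element is an id. A condensed sketch of the same dispatch, with a made-up object id:

normalize_in_reply_to = fn
  id when is_binary(id) -> id
  %{"id" => id} when is_binary(id) -> id
  [first | _] when is_binary(first) -> first
  _ -> ""
end

# placeholder id
normalize_in_reply_to.(%{"id" => "https://a.example/objects/1"})
# => "https://a.example/objects/1"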
@ -116,9 +166,9 @@ def fix_context(object) do
|> Map.put("conversation", context)
end
def fix_attachments(object) do
def fix_attachments(%{"attachment" => attachment} = object) when is_list(attachment) do
attachments =
(object["attachment"] || [])
attachment
|> Enum.map(fn data ->
url = [%{"type" => "Link", "mediaType" => data["mediaType"], "href" => data["url"]}]
Map.put(data, "url", url)
@ -128,21 +178,41 @@ def fix_attachments(object) do
|> Map.put("attachment", attachments)
end
def fix_emoji(object) do
tags = object["tag"] || []
def fix_attachments(%{"attachment" => attachment} = object) when is_map(attachment) do
Map.put(object, "attachment", [attachment])
|> fix_attachments()
end
def fix_attachments(object), do: object
def fix_url(%{"url" => url} = object) when is_map(url) do
object
|> Map.put("url", url["href"])
end
def fix_url(%{"url" => url} = object) when is_list(url) do
first_element = Enum.at(url, 0)
url_string =
cond do
is_bitstring(first_element) -> first_element
is_map(first_element) -> first_element["href"] || ""
true -> ""
end
object
|> Map.put("url", url_string)
end
def fix_url(object), do: object
def fix_emoji(%{"tag" => tags} = object) when is_list(tags) do
emoji = tags |> Enum.filter(fn data -> data["type"] == "Emoji" and data["icon"] end)
emoji =
emoji
|> Enum.reduce(%{}, fn data, mapping ->
name = data["name"]
name =
if String.starts_with?(name, ":") do
name |> String.slice(1..-2)
else
name
end
name = String.trim(data["name"], ":")
mapping |> Map.put(name, data["icon"]["url"])
end)
@ -154,18 +224,37 @@ def fix_emoji(object) do
|> Map.put("emoji", emoji)
end
def fix_tag(object) do
def fix_emoji(%{"tag" => %{"type" => "Emoji"} = tag} = object) do
name = String.trim(tag["name"], ":")
emoji = %{name => tag["icon"]["url"]}
object
|> Map.put("emoji", emoji)
end
def fix_emoji(object), do: object
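
The shortcode handling now relies on String.trim/2 instead of slicing, which also copes with names that are not colon-wrapped. For example (placeholder emoji name and URL):

String.trim(":blobcat:", ":")
# => "blobcat"

# so a tag like %{"type" => "Emoji", "name" => ":blobcat:", "icon" => %{"url" => "https://a.example/emoji/blobcat.png"}}
# contributes %{"blobcat" => "https://a.example/emoji/blobcat.png"} to the "emoji" map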
def fix_tag(%{"tag" => tag} = object) when is_list(tag) do
tags =
(object["tag"] || [])
tag
|> Enum.filter(fn data -> data["type"] == "Hashtag" and data["name"] end)
|> Enum.map(fn data -> String.slice(data["name"], 1..-1) end)
combined = (object["tag"] || []) ++ tags
combined = tag ++ tags
object
|> Map.put("tag", combined)
end
def fix_tag(%{"tag" => %{"type" => "Hashtag", "name" => hashtag} = tag} = object) do
combined = [tag, String.slice(hashtag, 1..-1)]
object
|> Map.put("tag", combined)
end
def fix_tag(object), do: object
# content map usually only has one language so this will do for now.
def fix_content_map(%{"contentMap" => content_map} = object) do
content_groups = Map.to_list(content_map)
@ -187,7 +276,7 @@ def handle_incoming(%{"id" => id}) when not (is_binary(id) and length(id) > 8),
# - tags
# - emoji
def handle_incoming(%{"type" => "Create", "object" => %{"type" => objtype} = object} = data)
when objtype in ["Article", "Note", "Video"] do
when objtype in ["Article", "Note", "Video", "Page"] do
actor = get_actor(data)
data =
@ -271,8 +360,10 @@ defp get_follow_activity(follow_object, followed) do
def handle_incoming(
%{"type" => "Accept", "object" => follow_object, "actor" => actor, "id" => id} = data
) do
with %User{} = followed <- User.get_or_fetch_by_ap_id(actor),
with actor <- get_actor(data),
%User{} = followed <- User.get_or_fetch_by_ap_id(actor),
{:ok, follow_activity} <- get_follow_activity(follow_object, followed),
{:ok, follow_activity} <- Utils.update_follow_state(follow_activity, "accept"),
%User{local: true} = follower <- User.get_cached_by_ap_id(follow_activity.data["actor"]),
{:ok, activity} <-
ActivityPub.accept(%{
@ -295,8 +386,10 @@ def handle_incoming(
def handle_incoming(
%{"type" => "Reject", "object" => follow_object, "actor" => actor, "id" => id} = data
) do
with %User{} = followed <- User.get_or_fetch_by_ap_id(actor),
with actor <- get_actor(data),
%User{} = followed <- User.get_or_fetch_by_ap_id(actor),
{:ok, follow_activity} <- get_follow_activity(follow_object, followed),
{:ok, follow_activity} <- Utils.update_follow_state(follow_activity, "reject"),
%User{local: true} = follower <- User.get_cached_by_ap_id(follow_activity.data["actor"]),
{:ok, activity} <-
ActivityPub.accept(%{
@ -315,11 +408,11 @@ def handle_incoming(
end
def handle_incoming(
%{"type" => "Like", "object" => object_id, "actor" => actor, "id" => id} = _data
%{"type" => "Like", "object" => object_id, "actor" => actor, "id" => id} = data
) do
with %User{} = actor <- User.get_or_fetch_by_ap_id(actor),
{:ok, object} <-
get_obj_helper(object_id) || ActivityPub.fetch_object_from_id(object_id),
with actor <- get_actor(data),
%User{} = actor <- User.get_or_fetch_by_ap_id(actor),
{:ok, object} <- get_obj_helper(object_id) || fetch_obj_helper(object_id),
{:ok, activity, _object} <- ActivityPub.like(actor, object, id, false) do
{:ok, activity}
else
@ -328,11 +421,11 @@ def handle_incoming(
end
def handle_incoming(
%{"type" => "Announce", "object" => object_id, "actor" => actor, "id" => id} = _data
%{"type" => "Announce", "object" => object_id, "actor" => actor, "id" => id} = data
) do
with %User{} = actor <- User.get_or_fetch_by_ap_id(actor),
{:ok, object} <-
get_obj_helper(object_id) || ActivityPub.fetch_object_from_id(object_id),
with actor <- get_actor(data),
%User{} = actor <- User.get_or_fetch_by_ap_id(actor),
{:ok, object} <- get_obj_helper(object_id) || fetch_obj_helper(object_id),
{:ok, activity, _object} <- ActivityPub.announce(actor, object, id, false) do
{:ok, activity}
else
@ -341,9 +434,10 @@ def handle_incoming(
end
def handle_incoming(
%{"type" => "Update", "object" => %{"type" => "Person"} = object, "actor" => actor_id} =
%{"type" => "Update", "object" => %{"type" => object_type} = object, "actor" => actor_id} =
data
) do
)
when object_type in ["Person", "Application", "Service", "Organization"] do
with %User{ap_id: ^actor_id} = actor <- User.get_by_ap_id(object["id"]) do
{:ok, new_user_data} = ActivityPub.user_data_from_user_object(object)
@ -373,15 +467,20 @@ def handle_incoming(
end
end
# TODO: Make secure.
# TODO: We presently assume that any actor on the same origin domain as the object being
# deleted has the rights to delete that object. A better way to validate whether or not
# the object should be deleted is to refetch the object URI, which should return either
# an error or a tombstone. This would allow us to verify that a deletion actually took
# place.
def handle_incoming(
%{"type" => "Delete", "object" => object_id, "actor" => actor, "id" => _id} = _data
%{"type" => "Delete", "object" => object_id, "actor" => _actor, "id" => _id} = data
) do
object_id = Utils.get_ap_id(object_id)
with %User{} = _actor <- User.get_or_fetch_by_ap_id(actor),
{:ok, object} <-
get_obj_helper(object_id) || ActivityPub.fetch_object_from_id(object_id),
with actor <- get_actor(data),
%User{} = actor <- User.get_or_fetch_by_ap_id(actor),
{:ok, object} <- get_obj_helper(object_id) || fetch_obj_helper(object_id),
:ok <- contain_origin(actor.ap_id, object.data),
{:ok, activity} <- ActivityPub.delete(object, false) do
{:ok, activity}
else
@ -395,11 +494,11 @@ def handle_incoming(
"object" => %{"type" => "Announce", "object" => object_id},
"actor" => actor,
"id" => id
} = _data
} = data
) do
with %User{} = actor <- User.get_or_fetch_by_ap_id(actor),
{:ok, object} <-
get_obj_helper(object_id) || ActivityPub.fetch_object_from_id(object_id),
with actor <- get_actor(data),
%User{} = actor <- User.get_or_fetch_by_ap_id(actor),
{:ok, object} <- get_obj_helper(object_id) || fetch_obj_helper(object_id),
{:ok, activity, _} <- ActivityPub.unannounce(actor, object, id, false) do
{:ok, activity}
else
@ -425,9 +524,6 @@ def handle_incoming(
end
end
@ap_config Application.get_env(:pleroma, :activitypub)
@accept_blocks Keyword.get(@ap_config, :accept_blocks)
def handle_incoming(
%{
"type" => "Undo",
@ -436,7 +532,7 @@ def handle_incoming(
"id" => id
} = _data
) do
with true <- @accept_blocks,
with true <- Pleroma.Config.get([:activitypub, :accept_blocks]),
%User{local: true} = blocked <- User.get_cached_by_ap_id(blocked),
%User{} = blocker <- User.get_or_fetch_by_ap_id(blocker),
{:ok, activity} <- ActivityPub.unblock(blocker, blocked, id, false) do
@ -450,7 +546,7 @@ def handle_incoming(
def handle_incoming(
%{"type" => "Block", "object" => blocked, "actor" => blocker, "id" => id} = data
) do
with true <- @accept_blocks,
with true <- Pleroma.Config.get([:activitypub, :accept_blocks]),
%User{local: true} = blocked = User.get_cached_by_ap_id(blocked),
%User{} = blocker = User.get_or_fetch_by_ap_id(blocker),
{:ok, activity} <- ActivityPub.block(blocker, blocked, id, false) do
@ -468,11 +564,11 @@ def handle_incoming(
"object" => %{"type" => "Like", "object" => object_id},
"actor" => actor,
"id" => id
} = _data
} = data
) do
with %User{} = actor <- User.get_or_fetch_by_ap_id(actor),
{:ok, object} <-
get_obj_helper(object_id) || ActivityPub.fetch_object_from_id(object_id),
with actor <- get_actor(data),
%User{} = actor <- User.get_or_fetch_by_ap_id(actor),
{:ok, object} <- get_obj_helper(object_id) || fetch_obj_helper(object_id),
{:ok, activity, _, _} <- ActivityPub.unlike(actor, object, id, false) do
{:ok, activity}
else
@ -482,6 +578,9 @@ def handle_incoming(
def handle_incoming(_), do: :error
def fetch_obj_helper(id) when is_bitstring(id), do: ActivityPub.fetch_object_from_id(id)
def fetch_obj_helper(obj) when is_map(obj), do: ActivityPub.fetch_object_from_id(obj["id"])
def get_obj_helper(id) do
if object = Object.normalize(id), do: {:ok, object}, else: nil
end
@ -508,6 +607,8 @@ def prepare_object(object) do
|> prepare_attachments
|> set_conversation
|> set_reply_to_uri
|> strip_internal_fields
|> strip_internal_tags
end
# @doc
@ -523,7 +624,7 @@ def prepare_outgoing(%{"type" => "Create", "object" => %{"type" => "Note"} = obj
data =
data
|> Map.put("object", object)
|> Map.put("@context", "https://www.w3.org/ns/activitystreams")
|> Map.merge(Utils.make_json_ld_header())
{:ok, data}
end
@ -542,7 +643,7 @@ def prepare_outgoing(%{"type" => "Accept"} = data) do
data =
data
|> Map.put("object", object)
|> Map.put("@context", "https://www.w3.org/ns/activitystreams")
|> Map.merge(Utils.make_json_ld_header())
{:ok, data}
end
@ -560,7 +661,7 @@ def prepare_outgoing(%{"type" => "Reject"} = data) do
data =
data
|> Map.put("object", object)
|> Map.put("@context", "https://www.w3.org/ns/activitystreams")
|> Map.merge(Utils.make_json_ld_header())
{:ok, data}
end
@ -570,14 +671,14 @@ def prepare_outgoing(%{"type" => _type} = data) do
data =
data
|> maybe_fix_object_url
|> Map.put("@context", "https://www.w3.org/ns/activitystreams")
|> Map.merge(Utils.make_json_ld_header())
{:ok, data}
end
def maybe_fix_object_url(data) do
if is_binary(data["object"]) and not String.starts_with?(data["object"], "http") do
case ActivityPub.fetch_object_from_id(data["object"]) do
case fetch_obj_helper(data["object"]) do
{:ok, relative_object} ->
if relative_object.data["external_url"] do
_data =
@ -612,12 +713,9 @@ def add_hashtags(object) do
end
def add_mention_tags(object) do
recipients = object["to"] ++ (object["cc"] || [])
mentions =
recipients
|> Enum.map(fn ap_id -> User.get_cached_by_ap_id(ap_id) end)
|> Enum.filter(& &1)
object
|> Utils.get_notified_from_object()
|> Enum.map(fn user ->
%{"type" => "Mention", "href" => user.ap_id, "name" => "@#{user.nickname}"}
end)
@ -677,6 +775,29 @@ def prepare_attachments(object) do
|> Map.put("attachment", attachments)
end
defp strip_internal_fields(object) do
object
|> Map.drop([
"likes",
"like_count",
"announcements",
"announcement_count",
"emoji",
"context_id"
])
end
defp strip_internal_tags(%{"tag" => tags} = object) do
tags =
tags
|> Enum.filter(fn x -> is_map(x) end)
object
|> Map.put("tag", tags)
end
defp strip_internal_tags(object), do: object
defp user_upgrade_task(user) do
old_follower_address = User.ap_followers(user)

View file

@ -1,11 +1,13 @@
defmodule Pleroma.Web.ActivityPub.Utils do
alias Pleroma.{Repo, Web, Object, Activity, User}
alias Pleroma.{Repo, Web, Object, Activity, User, Notification}
alias Pleroma.Web.Router.Helpers
alias Pleroma.Web.Endpoint
alias Ecto.{Changeset, UUID}
import Ecto.Query
require Logger
@supported_object_types ["Article", "Note", "Video", "Page"]
# Some implementations send the actor URI as the actor field, others send the entire actor object,
# so figure out what the actor's URI is based on what we have.
def get_ap_id(object) do
@ -19,22 +21,58 @@ def normalize_params(params) do
Map.put(params, "actor", get_ap_id(params["actor"]))
end
defp recipient_in_collection(ap_id, coll) when is_binary(coll), do: ap_id == coll
defp recipient_in_collection(ap_id, coll) when is_list(coll), do: ap_id in coll
defp recipient_in_collection(_, _), do: false
def recipient_in_message(ap_id, params) do
cond do
recipient_in_collection(ap_id, params["to"]) ->
true
recipient_in_collection(ap_id, params["cc"]) ->
true
recipient_in_collection(ap_id, params["bto"]) ->
true
recipient_in_collection(ap_id, params["bcc"]) ->
true
# if the message is unaddressed at all, then assume it is directly addressed
# to the recipient
!params["to"] && !params["cc"] && !params["bto"] && !params["bcc"] ->
true
true ->
false
end
end
defp extract_list(target) when is_binary(target), do: [target]
defp extract_list(lst) when is_list(lst), do: lst
defp extract_list(_), do: []
def maybe_splice_recipient(ap_id, params) do
need_splice =
!recipient_in_collection(ap_id, params["to"]) &&
!recipient_in_collection(ap_id, params["cc"])
cc_list = extract_list(params["cc"])
if need_splice do
params
|> Map.put("cc", [ap_id | cc_list])
else
params
end
end
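
maybe_splice_recipient/2 adds the inbox owner to cc when a delivered message does not address them in to or cc (for example a bto/bcc delivery). A sketch in iex -S mix, with placeholder ap_ids:

# placeholder ap_ids
params = %{"to" => ["https://b.example/users/bob"], "cc" => []}

Pleroma.Web.ActivityPub.Utils.maybe_splice_recipient("https://a.example/users/alice", params)
# => %{"cc" => ["https://a.example/users/alice"], "to" => ["https://b.example/users/bob"]}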
def make_json_ld_header do
%{
"@context" => [
"https://www.w3.org/ns/activitystreams",
"https://w3id.org/security/v1",
%{
"manuallyApprovesFollowers" => "as:manuallyApprovesFollowers",
"sensitive" => "as:sensitive",
"Hashtag" => "as:Hashtag",
"ostatus" => "http://ostatus.org#",
"atomUri" => "ostatus:atomUri",
"inReplyToAtomUri" => "ostatus:inReplyToAtomUri",
"conversation" => "ostatus:conversation",
"toot" => "http://joinmastodon.org/ns#",
"Emoji" => "toot:Emoji"
}
"#{Web.base_url()}/schemas/litepub-0.1.jsonld"
]
}
end
@ -59,6 +97,21 @@ def generate_id(type) do
"#{Web.base_url()}/#{type}/#{UUID.generate()}"
end
def get_notified_from_object(%{"type" => type} = object) when type in @supported_object_types do
fake_create_activity = %{
"to" => object["to"],
"cc" => object["cc"],
"type" => "Create",
"object" => object
}
Notification.get_notified_from_activity(%Activity{data: fake_create_activity}, false)
end
def get_notified_from_object(object) do
Notification.get_notified_from_activity(%Activity{data: object}, false)
end
def create_context(context) do
context = context || generate_id("contexts")
changeset = Object.context_mapping(context)
@ -128,7 +181,7 @@ def lazy_put_object_defaults(map, activity \\ %{}) do
Inserts a full object if it is contained in an activity.
"""
def insert_full_object(%{"object" => %{"type" => type} = object_data})
when is_map(object_data) and type in ["Article", "Note", "Video"] do
when is_map(object_data) and type in @supported_object_types do
with {:ok, _} <- Object.create(object_data) do
:ok
end
@ -247,11 +300,11 @@ def make_follow_data(
"actor" => follower_id,
"to" => [followed_id],
"cc" => ["https://www.w3.org/ns/activitystreams#Public"],
"object" => followed_id
"object" => followed_id,
"state" => "pending"
}
data = if activity_id, do: Map.put(data, "id", activity_id), else: data
data = if User.locked?(followed), do: Map.put(data, "state", "pending"), else: data
data
end
@ -306,6 +359,24 @@ def get_existing_announce(actor, %{data: %{"id" => id}}) do
@doc """
Make announce activity data for the given actor and object
"""
# for relayed messages, we only want to send to subscribers
def make_announce_data(
%User{ap_id: ap_id, nickname: nil} = user,
%Object{data: %{"id" => id}} = object,
activity_id
) do
data = %{
"type" => "Announce",
"actor" => ap_id,
"object" => id,
"to" => [user.follower_address],
"cc" => [],
"context" => object.data["context"]
}
if activity_id, do: Map.put(data, "id", activity_id), else: data
end
def make_announce_data(
%User{ap_id: ap_id} = user,
%Object{data: %{"id" => id}} = object,
@ -360,7 +431,12 @@ def make_unlike_data(
if activity_id, do: Map.put(data, "id", activity_id), else: data
end
def add_announce_to_object(%Activity{data: %{"actor" => actor}}, object) do
def add_announce_to_object(
%Activity{
data: %{"actor" => actor, "cc" => ["https://www.w3.org/ns/activitystreams#Public"]}
},
object
) do
announcements =
if is_list(object.data["announcements"]), do: object.data["announcements"], else: []
@ -369,6 +445,8 @@ def add_announce_to_object(%Activity{data: %{"actor" => actor}}, object) do
end
end
def add_announce_to_object(_, object), do: {:ok, object}
def remove_announce_from_object(%Activity{data: %{"actor" => actor}}, object) do
announcements =
if is_list(object.data["announcements"]), do: object.data["announcements"], else: []

View file

@ -1,27 +1,34 @@
defmodule Pleroma.Web.ActivityPub.ObjectView do
use Pleroma.Web, :view
alias Pleroma.{Object, Activity}
alias Pleroma.Web.ActivityPub.Transmogrifier
def render("object.json", %{object: object}) do
base = %{
"@context" => [
"https://www.w3.org/ns/activitystreams",
"https://w3id.org/security/v1",
%{
"manuallyApprovesFollowers" => "as:manuallyApprovesFollowers",
"sensitive" => "as:sensitive",
"Hashtag" => "as:Hashtag",
"ostatus" => "http://ostatus.org#",
"atomUri" => "ostatus:atomUri",
"inReplyToAtomUri" => "ostatus:inReplyToAtomUri",
"conversation" => "ostatus:conversation",
"toot" => "http://joinmastodon.org/ns#",
"Emoji" => "toot:Emoji"
}
]
}
def render("object.json", %{object: %Object{} = object}) do
base = Pleroma.Web.ActivityPub.Utils.make_json_ld_header()
additional = Transmogrifier.prepare_object(object.data)
Map.merge(base, additional)
end
def render("object.json", %{object: %Activity{data: %{"type" => "Create"}} = activity}) do
base = Pleroma.Web.ActivityPub.Utils.make_json_ld_header()
object = Object.normalize(activity.data["object"])
additional =
Transmogrifier.prepare_object(activity.data)
|> Map.put("object", Transmogrifier.prepare_object(object.data))
Map.merge(base, additional)
end
def render("object.json", %{object: %Activity{} = activity}) do
base = Pleroma.Web.ActivityPub.Utils.make_json_ld_header()
object = Object.normalize(activity.data["object"])
additional =
Transmogrifier.prepare_object(activity.data)
|> Map.put("object", object.data["id"])
Map.merge(base, additional)
end
end

View file

@ -9,6 +9,35 @@ defmodule Pleroma.Web.ActivityPub.UserView do
alias Pleroma.Web.ActivityPub.Utils
import Ecto.Query
# the instance itself is not a Person, but instead an Application
def render("user.json", %{user: %{nickname: nil} = user}) do
{:ok, user} = WebFinger.ensure_keys_present(user)
{:ok, _, public_key} = Salmon.keys_from_pem(user.info["keys"])
public_key = :public_key.pem_entry_encode(:SubjectPublicKeyInfo, public_key)
public_key = :public_key.pem_encode([public_key])
%{
"id" => user.ap_id,
"type" => "Application",
"following" => "#{user.ap_id}/following",
"followers" => "#{user.ap_id}/followers",
"inbox" => "#{user.ap_id}/inbox",
"name" => "Pleroma",
"summary" => "Virtual actor for Pleroma relay",
"url" => user.ap_id,
"manuallyApprovesFollowers" => false,
"publicKey" => %{
"id" => "#{user.ap_id}#main-key",
"owner" => user.ap_id,
"publicKeyPem" => public_key
},
"endpoints" => %{
"sharedInbox" => "#{Pleroma.Web.Endpoint.url()}/inbox"
}
}
|> Map.merge(Utils.make_json_ld_header())
end
def render("user.json", %{user: user}) do
{:ok, user} = WebFinger.ensure_keys_present(user)
{:ok, _, public_key} = Salmon.keys_from_pem(user.info["keys"])
@ -42,7 +71,8 @@ def render("user.json", %{user: user}) do
"image" => %{
"type" => "Image",
"url" => User.banner_url(user)
}
},
"tag" => user.info["source_data"]["tag"] || []
}
|> Map.merge(Utils.make_json_ld_header())
end

View file

@ -0,0 +1,158 @@
defmodule Pleroma.Web.AdminAPI.AdminAPIController do
use Pleroma.Web, :controller
alias Pleroma.{User, Repo}
alias Pleroma.Web.ActivityPub.Relay
require Logger
action_fallback(:errors)
def user_delete(conn, %{"nickname" => nickname}) do
user = User.get_by_nickname(nickname)
if user.local == true do
User.delete(user)
else
User.delete(user)
end
conn
|> json(nickname)
end
def user_create(
conn,
%{"nickname" => nickname, "email" => email, "password" => password}
) do
new_user = %{
nickname: nickname,
name: nickname,
email: email,
password: password,
password_confirmation: password,
bio: "."
}
User.register_changeset(%User{}, new_user)
|> Repo.insert!()
conn
|> json(new_user.nickname)
end
def right_add(conn, %{"permission_group" => permission_group, "nickname" => nickname})
when permission_group in ["moderator", "admin"] do
user = User.get_by_nickname(nickname)
info =
user.info
|> Map.put("is_" <> permission_group, true)
cng = User.info_changeset(user, %{info: info})
{:ok, user} = User.update_and_set_cache(cng)
conn
|> json(user.info)
end
def right_get(conn, %{"nickname" => nickname}) do
user = User.get_by_nickname(nickname)
conn
|> json(user.info)
end
def right_add(conn, _) do
conn
|> put_status(404)
|> json(%{error: "No such permission_group"})
end
def right_delete(
%{assigns: %{user: %User{:nickname => admin_nickname}}} = conn,
%{
"permission_group" => permission_group,
"nickname" => nickname
}
)
when permission_group in ["moderator", "admin"] do
if admin_nickname == nickname do
conn
|> put_status(403)
|> json(%{error: "You can't revoke your own admin status."})
else
user = User.get_by_nickname(nickname)
info =
user.info
|> Map.put("is_" <> permission_group, false)
cng = User.info_changeset(user, %{info: info})
{:ok, user} = User.update_and_set_cache(cng)
conn
|> json(user.info)
end
end
def right_delete(conn, _) do
conn
|> put_status(404)
|> json(%{error: "No such permission_group"})
end
def relay_follow(conn, %{"relay_url" => target}) do
{status, message} = Relay.follow(target)
if status == :ok do
conn
|> json(target)
else
conn
|> put_status(500)
|> json(target)
end
end
def relay_unfollow(conn, %{"relay_url" => target}) do
{status, message} = Relay.unfollow(target)
if status == :ok do
conn
|> json(target)
else
conn
|> put_status(500)
|> json(target)
end
end
@shortdoc "Get a account registeration invite token (base64 string)"
def get_invite_token(conn, _params) do
{:ok, token} = Pleroma.UserInviteToken.create_token()
conn
|> json(token.token)
end
@shortdoc "Get a password reset token (base64 string) for given nickname"
def get_password_reset(conn, %{"nickname" => nickname}) do
(%User{local: true} = user) = User.get_by_nickname(nickname)
{:ok, token} = Pleroma.PasswordResetToken.create_token(user)
conn
|> json(token.token)
end
def errors(conn, {:param_cast, _}) do
conn
|> put_status(400)
|> json("Invalid parameters")
end
def errors(conn, _) do
conn
|> put_status(500)
|> json("Something went wrong")
end
end

View file

@ -4,9 +4,7 @@ defmodule Pleroma.Web.UserSocket do
## Channels
# channel "room:*", Pleroma.Web.RoomChannel
if Application.get_env(:pleroma, :chat) |> Keyword.get(:enabled) do
channel("chat:*", Pleroma.Web.ChatChannel)
end
channel("chat:*", Pleroma.Web.ChatChannel)
## Transports
transport(:websocket, Phoenix.Transports.WebSocket)
@ -24,7 +22,8 @@ defmodule Pleroma.Web.UserSocket do
# See `Phoenix.Token` documentation for examples in
# performing token verification on connect.
def connect(%{"token" => token}, socket) do
with {:ok, user_id} <- Phoenix.Token.verify(socket, "user socket", token, max_age: 84600),
with true <- Pleroma.Config.get([:chat, :enabled]),
{:ok, user_id} <- Phoenix.Token.verify(socket, "user socket", token, max_age: 84600),
%User{} = user <- Pleroma.Repo.get(User, user_id) do
{:ok, assign(socket, :user_name, user.nickname)}
else

View file

@ -1,5 +1,5 @@
defmodule Pleroma.Web.CommonAPI do
alias Pleroma.{Repo, Activity, Object}
alias Pleroma.{User, Repo, Activity, Object}
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Formatter
@ -36,7 +36,6 @@ def unrepeat(id_or_ap_id, user) do
def favorite(id_or_ap_id, user) do
with %Activity{} = activity <- get_by_id_or_ap_id(id_or_ap_id),
false <- activity.data["actor"] == user.ap_id,
object <- Object.normalize(activity.data["object"]["id"]) do
ActivityPub.like(user, object)
else
@ -47,7 +46,6 @@ def favorite(id_or_ap_id, user) do
def unfavorite(id_or_ap_id, user) do
with %Activity{} = activity <- get_by_id_or_ap_id(id_or_ap_id),
false <- activity.data["actor"] == user.ap_id,
object <- Object.normalize(activity.data["object"]["id"]) do
ActivityPub.unlike(user, object)
else
@ -61,28 +59,48 @@ def get_visibility(%{"visibility" => visibility})
do: visibility
def get_visibility(%{"in_reply_to_status_id" => status_id}) when not is_nil(status_id) do
inReplyTo = get_replied_to_activity(status_id)
Pleroma.Web.MastodonAPI.StatusView.get_visibility(inReplyTo.data["object"])
case get_replied_to_activity(status_id) do
nil ->
"public"
inReplyTo ->
Pleroma.Web.MastodonAPI.StatusView.get_visibility(inReplyTo.data["object"])
end
end
def get_visibility(_), do: "public"
@instance Application.get_env(:pleroma, :instance)
@limit Keyword.get(@instance, :limit)
defp get_content_type(content_type) do
if Enum.member?(Pleroma.Config.get([:instance, :allowed_post_formats]), content_type) do
content_type
else
"text/plain"
end
end
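
get_content_type/1 only honours a requested content type when it appears in the instance's allowed_post_formats list, and falls back to text/plain otherwise. A configuration sketch listing the three formats format_input/4 handles (values shown are examples):

config :pleroma, :instance,
  allowed_post_formats: ["text/plain", "text/html", "text/markdown"]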
def post(user, %{"status" => status} = data) do
visibility = get_visibility(data)
limit = Pleroma.Config.get([:instance, :limit])
with status <- String.trim(status),
length when length in 1..@limit <- String.length(status),
attachments <- attachments_from_ids(data["media_ids"]),
mentions <- Formatter.parse_mentions(status),
inReplyTo <- get_replied_to_activity(data["in_reply_to_status_id"]),
{to, cc} <- to_for_user_and_mentions(user, mentions, inReplyTo, visibility),
tags <- Formatter.parse_tags(status, data),
content_html <-
make_content_html(status, mentions, attachments, tags, data["no_attachment_links"]),
make_content_html(
status,
mentions,
attachments,
tags,
get_content_type(data["content_type"]),
data["no_attachment_links"]
),
context <- make_context(inReplyTo),
cw <- data["spoiler_text"],
full_payload <- String.trim(status <> (data["spoiler_text"] || "")),
length when length in 1..limit <- String.length(full_payload),
object <-
make_note_data(
user.ap_id,
@ -118,6 +136,18 @@ def post(user, %{"status" => status} = data) do
end
def update(user) do
user =
with emoji <- emoji_from_profile(user),
source_data <- (user.info["source_data"] || %{}) |> Map.put("tag", emoji),
new_info <- Map.put(user.info, "source_data", source_data),
change <- User.info_changeset(user, %{info: new_info}),
{:ok, user} <- User.update_and_set_cache(change) do
user
else
_e ->
user
end
ActivityPub.update(%{
local: true,
to: [user.follower_address],

View file

@ -1,6 +1,8 @@
defmodule Pleroma.Web.CommonAPI.Utils do
alias Pleroma.{Repo, Object, Formatter, Activity}
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.Endpoint
alias Pleroma.Web.MediaProxy
alias Pleroma.User
alias Calendar.Strftime
alias Comeonin.Pbkdf2
@ -17,6 +19,8 @@ def get_by_id_or_ap_id(id) do
end
end
def get_replied_to_activity(""), do: nil
def get_replied_to_activity(id) when not is_nil(id) do
Repo.get(Activity, id)
end
@ -30,21 +34,29 @@ def attachments_from_ids(ids) do
end
def to_for_user_and_mentions(user, mentions, inReplyTo, "public") do
to = ["https://www.w3.org/ns/activitystreams#Public"]
mentioned_users = Enum.map(mentions, fn {_, %{ap_id: ap_id}} -> ap_id end)
cc = [user.follower_address | mentioned_users]
to = ["https://www.w3.org/ns/activitystreams#Public" | mentioned_users]
cc = [user.follower_address]
if inReplyTo do
{to, Enum.uniq([inReplyTo.data["actor"] | cc])}
{Enum.uniq([inReplyTo.data["actor"] | to]), cc}
else
{to, cc}
end
end
def to_for_user_and_mentions(user, mentions, inReplyTo, "unlisted") do
{to, cc} = to_for_user_and_mentions(user, mentions, inReplyTo, "public")
{cc, to}
mentioned_users = Enum.map(mentions, fn {_, %{ap_id: ap_id}} -> ap_id end)
to = [user.follower_address | mentioned_users]
cc = ["https://www.w3.org/ns/activitystreams#Public"]
if inReplyTo do
{Enum.uniq([inReplyTo.data["actor"] | to]), cc}
else
{to, cc}
end
end
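
The two clauses above differ only in where the Public collection and the follower collection end up, while mentioned users stay directly addressed in to. A minimal sketch of the resulting addressing, with placeholder ap_ids:

addressing = fn
  # "public": the Public collection goes in "to", followers in "cc"
  "public", mentioned, followers ->
    {["https://www.w3.org/ns/activitystreams#Public" | mentioned], [followers]}

  # "unlisted": the two are swapped, mentions stay directly addressed
  "unlisted", mentioned, followers ->
    {[followers | mentioned], ["https://www.w3.org/ns/activitystreams#Public"]}
end

# placeholder mention and follower collection
addressing.("unlisted", ["https://b.example/users/bob"], "https://a.example/users/alice/followers")
# => {["https://a.example/users/alice/followers", "https://b.example/users/bob"],
#     ["https://www.w3.org/ns/activitystreams#Public"]}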
def to_for_user_and_mentions(user, mentions, inReplyTo, "private") do
@ -62,9 +74,16 @@ def to_for_user_and_mentions(_user, mentions, inReplyTo, "direct") do
end
end
def make_content_html(status, mentions, attachments, tags, no_attachment_links \\ false) do
def make_content_html(
status,
mentions,
attachments,
tags,
content_type,
no_attachment_links \\ false
) do
status
|> format_input(mentions, tags)
|> format_input(mentions, tags, content_type)
|> maybe_add_attachments(attachments, no_attachment_links)
end
@ -80,8 +99,9 @@ def maybe_add_attachments(text, attachments, _no_links) do
def add_attachments(text, attachments) do
attachment_text =
Enum.map(attachments, fn
%{"url" => [%{"href" => href} | _]} ->
name = URI.decode(Path.basename(href))
%{"url" => [%{"href" => href} | _]} = attachment ->
name = attachment["name"] || URI.decode(Path.basename(href))
href = MediaProxy.url(href)
"<a href=\"#{href}\" class='attachment'>#{shortname(name)}</a>"
_ ->
@ -91,9 +111,9 @@ def add_attachments(text, attachments) do
Enum.join([text | attachment_text], "<br>")
end
def format_input(text, mentions, tags) do
def format_input(text, mentions, tags, "text/plain") do
text
|> Formatter.html_escape()
|> Formatter.html_escape("text/plain")
|> String.replace(~r/\r?\n/, "<br>")
|> (&{[], &1}).()
|> Formatter.add_links()
@ -102,6 +122,26 @@ def format_input(text, mentions, tags) do
|> Formatter.finalize()
end
def format_input(text, mentions, tags, "text/html") do
text
|> Formatter.html_escape("text/html")
|> String.replace(~r/\r?\n/, "<br>")
|> (&{[], &1}).()
|> Formatter.add_user_links(mentions)
|> Formatter.finalize()
end
def format_input(text, mentions, tags, "text/markdown") do
text
|> Earmark.as_html!()
|> Formatter.html_escape("text/html")
|> String.replace(~r/\r?\n/, "")
|> (&{[], &1}).()
|> Formatter.add_user_links(mentions)
|> Formatter.add_hashtag_links(tags)
|> Formatter.finalize()
end
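
The text/markdown clause renders the status with Earmark first and then runs the result through the same escaping and linking steps as text/html input. Roughly (exact markup depends on the Earmark version in use):

Earmark.as_html!("*hello* @friend #pleroma")
# => "<p><em>hello</em> @friend #pleroma</p>\n"  (approximate output; mentions and hashtags are linked afterwards by the Formatter)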
def add_tag_links(text, tags) do
tags =
tags
@ -195,4 +235,15 @@ def confirm_current_password(user, password) do
_ -> {:error, "Invalid password."}
end
end
def emoji_from_profile(%{info: info} = user) do
(Formatter.get_emoji(user.bio) ++ Formatter.get_emoji(user.name))
|> Enum.map(fn {shortcode, url} ->
%{
"type" => "Emoji",
"icon" => %{"type" => "Image", "url" => "#{Endpoint.url()}#{url}"},
"name" => ":#{shortcode}:"
}
end)
end
end

View file

@ -1,9 +1,7 @@
defmodule Pleroma.Web.Endpoint do
use Phoenix.Endpoint, otp_app: :pleroma
if Application.get_env(:pleroma, :chat) |> Keyword.get(:enabled) do
socket("/socket", Pleroma.Web.UserSocket)
end
socket("/socket", Pleroma.Web.UserSocket)
socket("/api/v1", Pleroma.Web.MastodonAPI.MastodonSocket)
@ -11,13 +9,17 @@ defmodule Pleroma.Web.Endpoint do
#
# You should set gzip to true if you are running phoenix.digest
# when deploying your static files in production.
plug(Plug.Static, at: "/media", from: Pleroma.Upload.upload_path(), gzip: false)
plug(CORSPlug)
plug(Pleroma.Plugs.HTTPSecurityPlug)
plug(Pleroma.Plugs.UploadedMedia)
plug(
Plug.Static,
at: "/",
from: :pleroma,
only: ~w(index.html static finmoji emoji packs sounds images instance sw.js favicon.png)
only:
~w(index.html static finmoji emoji packs sounds images instance sw.js favicon.png schemas)
)
# Code reloading can be explicitly enabled under the
@ -42,14 +44,23 @@ defmodule Pleroma.Web.Endpoint do
plug(Plug.MethodOverride)
plug(Plug.Head)
cookie_name =
if Application.get_env(:pleroma, Pleroma.Web.Endpoint) |> Keyword.get(:secure_cookie_flag),
do: "__Host-pleroma_key",
else: "pleroma_key"
# The session will be stored in the cookie and signed,
# this means its contents can be read but not tampered with.
# Set :encryption_salt if you would also like to encrypt it.
plug(
Plug.Session,
store: :cookie,
key: "_pleroma_key",
signing_salt: "CqaoopA2"
key: cookie_name,
signing_salt: {Pleroma.Config, :get, [[__MODULE__, :signing_salt], "CqaoopA2"]},
http_only: true,
secure:
Application.get_env(:pleroma, Pleroma.Web.Endpoint) |> Keyword.get(:secure_cookie_flag),
extra: "SameSite=Strict"
)
plug(Pleroma.Web.Router)
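
The session cookie name, salt and secure flag are now driven by the endpoint's config: with secure_cookie_flag set, the cookie is renamed to __Host-pleroma_key and marked Secure. A sketch of the relevant overrides (the salt value is a placeholder, generate your own):

config :pleroma, Pleroma.Web.Endpoint,
  secure_cookie_flag: true,
  # placeholder value, use a randomly generated salt
  signing_salt: "replace-with-a-random-salt"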

View file

@ -3,16 +3,17 @@ defmodule Pleroma.Web.Federator do
alias Pleroma.User
alias Pleroma.Activity
alias Pleroma.Web.{WebFinger, Websub}
alias Pleroma.Web.Federator.RetryQueue
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Relay
alias Pleroma.Web.ActivityPub.Transmogrifier
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.OStatus
require Logger
@websub Application.get_env(:pleroma, :websub)
@ostatus Application.get_env(:pleroma, :ostatus)
@httpoison Application.get_env(:pleroma, :httpoison)
@instance Application.get_env(:pleroma, :instance)
@federating Keyword.get(@instance, :federating)
@max_jobs 20
def init(args) do
@ -64,11 +65,18 @@ def handle(:publish, activity) do
{:ok, actor} = WebFinger.ensure_keys_present(actor)
if ActivityPub.is_public?(activity) do
Logger.info(fn -> "Sending #{activity.data["id"]} out via WebSub" end)
Websub.publish(Pleroma.Web.OStatus.feed_path(actor), actor, activity)
if OStatus.is_representable?(activity) do
Logger.info(fn -> "Sending #{activity.data["id"]} out via WebSub" end)
Websub.publish(Pleroma.Web.OStatus.feed_path(actor), actor, activity)
Logger.info(fn -> "Sending #{activity.data["id"]} out via Salmon" end)
Pleroma.Web.Salmon.publish(actor, activity)
Logger.info(fn -> "Sending #{activity.data["id"]} out via Salmon" end)
Pleroma.Web.Salmon.publish(actor, activity)
end
if Keyword.get(Application.get_env(:pleroma, :instance), :allow_relay) do
Logger.info(fn -> "Relaying #{activity.data["id"]} out" end)
Relay.publish(activity)
end
end
Logger.info(fn -> "Sending #{activity.data["id"]} out via AP" end)
@ -94,44 +102,46 @@ def handle(:incoming_ap_doc, params) do
params = Utils.normalize_params(params)
# NOTE: we use the actor ID to do the containment, this is fine because an
# actor shouldn't be acting on objects outside their own AP server.
with {:ok, _user} <- ap_enabled_actor(params["actor"]),
nil <- Activity.normalize(params["id"]),
{:ok, _activity} <- Transmogrifier.handle_incoming(params) do
:ok <- Transmogrifier.contain_origin_from_id(params["actor"], params),
{:ok, activity} <- Transmogrifier.handle_incoming(params) do
{:ok, activity}
else
%Activity{} ->
Logger.info("Already had #{params["id"]}")
:error
_e ->
# Just drop those for now
Logger.info("Unhandled activity")
Logger.info(Poison.encode!(params, pretty: 2))
:error
end
end
def handle(:publish_single_ap, params) do
ActivityPub.publish_one(params)
case ActivityPub.publish_one(params) do
{:ok, _} ->
:ok
{:error, _} ->
RetryQueue.enqueue(params, ActivityPub)
end
end
def handle(:publish_single_websub, %{xml: xml, topic: topic, callback: callback, secret: secret}) do
signature = @websub.sign(secret || "", xml)
Logger.debug(fn -> "Pushing #{topic} to #{callback}" end)
def handle(
:publish_single_websub,
%{xml: xml, topic: topic, callback: callback, secret: secret} = params
) do
case Websub.publish_one(params) do
{:ok, _} ->
:ok
with {:ok, %{status_code: code}} <-
@httpoison.post(
callback,
xml,
[
{"Content-Type", "application/atom+xml"},
{"X-Hub-Signature", "sha1=#{signature}"}
],
timeout: 10000,
recv_timeout: 20000,
hackney: [pool: :default]
) do
Logger.debug(fn -> "Pushed to #{callback}, code #{code}" end)
else
e ->
Logger.debug(fn -> "Couldn't push to #{callback}, #{inspect(e)}" end)
{:error, _} ->
RetryQueue.enqueue(params, Websub)
end
end
@ -140,11 +150,15 @@ def handle(type, _) do
{:error, "Don't know what to do with this"}
end
def enqueue(type, payload, priority \\ 1) do
if @federating do
if Mix.env() == :test do
if Mix.env() == :test do
def enqueue(type, payload, priority \\ 1) do
if Pleroma.Config.get([:instance, :federating]) do
handle(type, payload)
else
end
end
else
def enqueue(type, payload, priority \\ 1) do
if Pleroma.Config.get([:instance, :federating]) do
GenServer.cast(__MODULE__, {:enqueue, type, payload, priority})
end
end

View file

@ -0,0 +1,71 @@
defmodule Pleroma.Web.Federator.RetryQueue do
use GenServer
alias Pleroma.Web.{WebFinger, Websub}
alias Pleroma.Web.ActivityPub.ActivityPub
require Logger
@websub Application.get_env(:pleroma, :websub)
@ostatus Application.get_env(:pleroma, :ostatus)
@httpoison Application.get_env(:pleroma, :httpoison)
@instance Application.get_env(:pleroma, :instance)
# initial timeout, 30 seconds
@initial_timeout 30_000
@max_retries 5
def init(args) do
{:ok, args}
end
def start_link() do
GenServer.start_link(__MODULE__, %{delivered: 0, dropped: 0}, name: __MODULE__)
end
def enqueue(data, transport, retries \\ 0) do
GenServer.cast(__MODULE__, {:maybe_enqueue, data, transport, retries + 1})
end
def get_retry_params(retries) do
if retries > @max_retries do
{:drop, "Max retries reached"}
else
{:retry, growth_function(retries)}
end
end
def handle_cast({:maybe_enqueue, data, transport, retries}, %{dropped: drop_count} = state) do
case get_retry_params(retries) do
{:retry, timeout} ->
Process.send_after(
__MODULE__,
{:send, data, transport, retries},
growth_function(retries)
)
{:noreply, state}
{:drop, message} ->
Logger.debug(message)
{:noreply, %{state | dropped: drop_count + 1}}
end
end
def handle_info({:send, data, transport, retries}, %{delivered: delivery_count} = state) do
case transport.publish_one(data) do
{:ok, _} ->
{:noreply, %{state | delivered: delivery_count + 1}}
{:error, reason} ->
enqueue(data, transport, retries)
{:noreply, state}
end
end
def handle_info(unknown, state) do
Logger.debug("RetryQueue: don't know what to do with #{inspect(unknown)}, ignoring")
{:noreply, state}
end
defp growth_function(retries) do
round(@initial_timeout * :math.pow(retries, 3))
end
end
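
With @initial_timeout at 30_000 ms and @max_retries at 5, growth_function/1 yields a cubic backoff; the delays work out as follows:

initial_timeout = 30_000

for retries <- 1..5 do
  {retries, round(initial_timeout * :math.pow(retries, 3))}
end
# => [{1, 30000}, {2, 240000}, {3, 810000}, {4, 1920000}, {5, 3750000}]
# i.e. 30 s, 4 min, 13.5 min, 32 min and 62.5 min; a sixth attempt exceeds @max_retries and is dropped.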

View file

@ -2,11 +2,12 @@ defmodule Pleroma.Web.MastodonAPI.MastodonAPIController do
use Pleroma.Web, :controller
alias Pleroma.{Repo, Object, Activity, User, Notification, Stats}
alias Pleroma.Web
alias Pleroma.Web.MastodonAPI.{StatusView, AccountView, MastodonView, ListView}
alias Pleroma.Web.MastodonAPI.{StatusView, AccountView, MastodonView, ListView, FilterView}
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Utils
alias Pleroma.Web.CommonAPI
alias Pleroma.Web.OAuth.{Authorization, Token, App}
alias Pleroma.Web.MediaProxy
alias Comeonin.Pbkdf2
import Ecto.Query
require Logger
@ -51,7 +52,7 @@ def update_credentials(%{assigns: %{user: user}} = conn, params) do
user =
if avatar = params["avatar"] do
with %Plug.Upload{} <- avatar,
{:ok, object} <- ActivityPub.upload(avatar),
{:ok, object} <- ActivityPub.upload(avatar, type: :avatar),
change = Ecto.Changeset.change(user, %{avatar: object.data}),
{:ok, user} = User.update_and_set_cache(change) do
user
@ -65,7 +66,7 @@ def update_credentials(%{assigns: %{user: user}} = conn, params) do
user =
if banner = params["header"] do
with %Plug.Upload{} <- banner,
{:ok, object} <- ActivityPub.upload(banner),
{:ok, object} <- ActivityPub.upload(banner, type: :banner),
new_info <- Map.put(user.info, "banner", object.data),
change <- User.info_changeset(user, %{info: new_info}),
{:ok, user} <- User.update_and_set_cache(change) do
@ -97,7 +98,7 @@ def update_credentials(%{assigns: %{user: user}} = conn, params) do
CommonAPI.update(user)
end
json(conn, AccountView.render("account.json", %{user: user}))
json(conn, AccountView.render("account.json", %{user: user, for: user}))
else
_e ->
conn
@ -107,13 +108,13 @@ def update_credentials(%{assigns: %{user: user}} = conn, params) do
end
def verify_credentials(%{assigns: %{user: user}} = conn, _) do
account = AccountView.render("account.json", %{user: user})
account = AccountView.render("account.json", %{user: user, for: user})
json(conn, account)
end
def user(conn, %{"id" => id}) do
def user(%{assigns: %{user: for_user}} = conn, %{"id" => id}) do
with %User{} = user <- Repo.get(User, id) do
account = AccountView.render("account.json", %{user: user})
account = AccountView.render("account.json", %{user: user, for: for_user})
json(conn, account)
else
_e ->
@ -123,22 +124,23 @@ def user(conn, %{"id" => id}) do
end
end
@instance Application.get_env(:pleroma, :instance)
@mastodon_api_level "2.3.3"
@mastodon_api_level "2.5.0"
def masto_instance(conn, _params) do
instance = Pleroma.Config.get(:instance)
response = %{
uri: Web.base_url(),
title: Keyword.get(@instance, :name),
description: Keyword.get(@instance, :description),
version: "#{@mastodon_api_level} (compatible; #{Keyword.get(@instance, :version)})",
email: Keyword.get(@instance, :email),
title: Keyword.get(instance, :name),
description: Keyword.get(instance, :description),
version: "#{@mastodon_api_level} (compatible; #{Pleroma.Application.named_version()})",
email: Keyword.get(instance, :email),
urls: %{
streaming_api: String.replace(Pleroma.Web.Endpoint.static_url(), "http", "ws")
},
stats: Stats.get_stats(),
thumbnail: Web.base_url() <> "/instance/thumbnail.jpeg",
max_toot_chars: Keyword.get(@instance, :limit)
max_toot_chars: Keyword.get(instance, :limit)
}
json(conn, response)
@ -149,7 +151,7 @@ def peers(conn, _params) do
end
defp mastodonized_emoji do
Pleroma.Formatter.get_custom_emoji()
Pleroma.Emoji.get_all()
|> Enum.map(fn {shortcode, relative_url} ->
url = to_string(URI.merge(Web.base_url(), relative_url))
@ -222,6 +224,7 @@ def home_timeline(%{assigns: %{user: user}} = conn, params) do
activities =
ActivityPub.fetch_activities([user.ap_id | user.following], params)
|> ActivityPub.contain_timeline(user)
|> Enum.reverse()
conn
@ -267,9 +270,12 @@ def user_statuses(%{assigns: %{user: reading_user}} = conn, params) do
end
end
def dm_timeline(%{assigns: %{user: user}} = conn, _params) do
def dm_timeline(%{assigns: %{user: user}} = conn, params) do
query =
ActivityPub.fetch_activities_query([user.ap_id], %{"type" => "Create", visibility: "direct"})
ActivityPub.fetch_activities_query(
[user.ap_id],
Map.merge(params, %{"type" => "Create", visibility: "direct"})
)
activities = Repo.all(query)
@ -281,7 +287,7 @@ def dm_timeline(%{assigns: %{user: user}} = conn, _params) do
def get_status(%{assigns: %{user: user}} = conn, %{"id" => id}) do
with %Activity{} = activity <- Repo.get(Activity, id),
true <- ActivityPub.visible_for_user?(activity, user) do
render(conn, StatusView, "status.json", %{activity: activity, for: user})
try_render(conn, StatusView, "status.json", %{activity: activity, for: user})
end
end
@ -344,7 +350,7 @@ def post_status(%{assigns: %{user: user}} = conn, %{"status" => _} = params) do
{:ok, activity} =
Cachex.fetch!(:idempotency_cache, idempotency_key, fn _ -> CommonAPI.post(user, params) end)
render(conn, StatusView, "status.json", %{activity: activity, for: user, as: :activity})
try_render(conn, StatusView, "status.json", %{activity: activity, for: user, as: :activity})
end
def delete_status(%{assigns: %{user: user}} = conn, %{"id" => id}) do
@ -360,28 +366,28 @@ def delete_status(%{assigns: %{user: user}} = conn, %{"id" => id}) do
def reblog_status(%{assigns: %{user: user}} = conn, %{"id" => ap_id_or_id}) do
with {:ok, announce, _activity} <- CommonAPI.repeat(ap_id_or_id, user) do
render(conn, StatusView, "status.json", %{activity: announce, for: user, as: :activity})
try_render(conn, StatusView, "status.json", %{activity: announce, for: user, as: :activity})
end
end
def unreblog_status(%{assigns: %{user: user}} = conn, %{"id" => ap_id_or_id}) do
with {:ok, _unannounce, %{data: %{"id" => id}}} <- CommonAPI.unrepeat(ap_id_or_id, user),
%Activity{} = activity <- Activity.get_create_activity_by_object_ap_id(id) do
render(conn, StatusView, "status.json", %{activity: activity, for: user, as: :activity})
try_render(conn, StatusView, "status.json", %{activity: activity, for: user, as: :activity})
end
end
def fav_status(%{assigns: %{user: user}} = conn, %{"id" => ap_id_or_id}) do
with {:ok, _fav, %{data: %{"id" => id}}} <- CommonAPI.favorite(ap_id_or_id, user),
%Activity{} = activity <- Activity.get_create_activity_by_object_ap_id(id) do
render(conn, StatusView, "status.json", %{activity: activity, for: user, as: :activity})
try_render(conn, StatusView, "status.json", %{activity: activity, for: user, as: :activity})
end
end
def unfav_status(%{assigns: %{user: user}} = conn, %{"id" => ap_id_or_id}) do
with {:ok, _, _, %{data: %{"id" => id}}} <- CommonAPI.unfavorite(ap_id_or_id, user),
%Activity{} = activity <- Activity.get_create_activity_by_object_ap_id(id) do
render(conn, StatusView, "status.json", %{activity: activity, for: user, as: :activity})
try_render(conn, StatusView, "status.json", %{activity: activity, for: user, as: :activity})
end
end
@ -433,6 +439,12 @@ def relationships(%{assigns: %{user: user}} = conn, %{"id" => id}) do
render(conn, AccountView, "relationships.json", %{user: user, targets: targets})
end
# Instead of returning a 400 when no "id" param is present, Mastodon returns an empty array.
def relationships(%{assigns: %{user: user}} = conn, _) do
conn
|> json([])
end
def update_media(%{assigns: %{user: _}} = conn, data) do
with %Object{} = object <- Repo.get(Object, data["id"]),
true <- is_binary(data["description"]),
@ -440,7 +452,7 @@ def update_media(%{assigns: %{user: _}} = conn, data) do
new_data = %{object.data | "name" => description}
change = Object.change(object, %{data: new_data})
{:ok, media_obj} = Repo.update(change)
{:ok, _} = Repo.update(change)
data =
new_data
@ -451,19 +463,12 @@ def update_media(%{assigns: %{user: _}} = conn, data) do
end
def upload(%{assigns: %{user: _}} = conn, %{"file" => file} = data) do
with {:ok, object} <- ActivityPub.upload(file) do
objdata =
if Map.has_key?(data, "description") do
Map.put(object.data, "name", data["description"])
else
object.data
end
change = Object.change(object, %{data: objdata})
with {:ok, object} <- ActivityPub.upload(file, description: Map.get(data, "description")) do
change = Object.change(object, %{data: object.data})
{:ok, object} = Repo.update(change)
objdata =
objdata
object.data
|> Map.put("id", object.id)
render(conn, StatusView, "attachment.json", %{attachment: objdata})
@ -498,6 +503,7 @@ def hashtag_timeline(%{assigns: %{user: user}} = conn, params) do
|> Map.put("type", "Create")
|> Map.put("local_only", local_only)
|> Map.put("blocking_user", user)
|> Map.put("tag", String.downcase(params["tag"]))
activities =
ActivityPub.fetch_public_activities(params)
@ -573,7 +579,13 @@ def reject_follow_request(%{assigns: %{user: followed}} = conn, %{"id" => id}) d
def follow(%{assigns: %{user: follower}} = conn, %{"id" => id}) do
with %User{} = followed <- Repo.get(User, id),
{:ok, follower} <- User.maybe_direct_follow(follower, followed),
{:ok, _activity} <- ActivityPub.follow(follower, followed) do
{:ok, _activity} <- ActivityPub.follow(follower, followed),
{:ok, follower, followed} <-
User.wait_and_refresh(
Pleroma.Config.get([:activitypub, :follow_handshake_timeout]),
follower,
followed
) do
render(conn, AccountView, "relationship.json", %{user: follower, target: followed})
else
{:error, message} ->
@ -587,7 +599,7 @@ def follow(%{assigns: %{user: follower}} = conn, %{"uri" => uri}) do
with %User{} = followed <- Repo.get_by(User, nickname: uri),
{:ok, follower} <- User.maybe_direct_follow(follower, followed),
{:ok, _activity} <- ActivityPub.follow(follower, followed) do
render(conn, AccountView, "account.json", %{user: followed})
render(conn, AccountView, "account.json", %{user: followed, for: follower})
else
{:error, message} ->
conn
@ -653,9 +665,7 @@ def unblock_domain(%{assigns: %{user: blocker}} = conn, %{"domain" => domain}) d
json(conn, %{})
end
def search2(%{assigns: %{user: user}} = conn, %{"q" => query} = params) do
accounts = User.search(query, params["resolve"] == "true")
def status_search(query) do
fetched =
if Regex.match?(~r/https?:/, query) do
with {:ok, object} <- ActivityPub.fetch_object_from_id(query) do
@ -680,7 +690,13 @@ def search2(%{assigns: %{user: user}} = conn, %{"q" => query} = params) do
order_by: [desc: :id]
)
statuses = Repo.all(q) ++ fetched
Repo.all(q) ++ fetched
end
def search2(%{assigns: %{user: user}} = conn, %{"q" => query} = params) do
accounts = User.search(query, params["resolve"] == "true")
statuses = status_search(query)
tags_path = Web.base_url() <> "/tag/"
@ -704,31 +720,7 @@ def search2(%{assigns: %{user: user}} = conn, %{"q" => query} = params) do
def search(%{assigns: %{user: user}} = conn, %{"q" => query} = params) do
accounts = User.search(query, params["resolve"] == "true")
fetched =
if Regex.match?(~r/https?:/, query) do
with {:ok, object} <- ActivityPub.fetch_object_from_id(query) do
[Activity.get_create_activity_by_object_ap_id(object.data["id"])]
else
_e -> []
end
end || []
q =
from(
a in Activity,
where: fragment("?->>'type' = 'Create'", a.data),
where: "https://www.w3.org/ns/activitystreams#Public" in a.recipients,
where:
fragment(
"to_tsvector('english', ?->'object'->>'content') @@ plainto_tsquery('english', ?)",
a.data,
^query
),
limit: 20,
order_by: [desc: :id]
)
statuses = Repo.all(q) ++ fetched
statuses = status_search(query)
tags =
String.split(query)
@ -784,6 +776,12 @@ def get_list(%{assigns: %{user: user}} = conn, %{"id" => id}) do
end
end
def account_lists(%{assigns: %{user: user}} = conn, %{"id" => account_id}) do
lists = Pleroma.List.get_lists_account_belongs(user, account_id)
res = ListView.render("lists.json", lists: lists)
json(conn, res)
end
def delete_list(%{assigns: %{user: user}} = conn, %{"id" => id}) do
with %Pleroma.List{} = list <- Pleroma.List.get(id, user),
{:ok, _list} <- Pleroma.List.delete(list) do
@ -850,9 +848,14 @@ def list_timeline(%{assigns: %{user: user}} = conn, %{"list_id" => id} = params)
|> Map.put("type", "Create")
|> Map.put("blocking_user", user)
# adding title is a hack to not make empty lists function like a public timeline
# we must filter the following list for the user to avoid leaking statuses the user
# does not actually have permission to see (for more info, peruse security issue #270).
following_to =
following
|> Enum.filter(fn x -> x in user.following end)
activities =
ActivityPub.fetch_activities([title | following], params)
ActivityPub.fetch_activities_bounded(following_to, following, params)
|> Enum.reverse()
conn
@ -872,7 +875,11 @@ def index(%{assigns: %{user: user}} = conn, _params) do
if user && token do
mastodon_emoji = mastodonized_emoji()
accounts = Map.put(%{}, user.id, AccountView.render("account.json", %{user: user}))
limit = Pleroma.Config.get([:instance, :limit])
accounts =
Map.put(%{}, user.id, AccountView.render("account.json", %{user: user, for: user}))
initial_state =
%{
@ -890,7 +897,7 @@ def index(%{assigns: %{user: user}} = conn, _params) do
auto_play_gif: false,
display_sensitive_media: false,
reduce_motion: false,
max_toot_chars: Keyword.get(@instance, :limit)
max_toot_chars: limit
},
rights: %{
delete_others_notice: !!user.info["is_moderator"]
@ -950,7 +957,7 @@ def index(%{assigns: %{user: user}} = conn, _params) do
push_subscription: nil,
accounts: accounts,
custom_emojis: mastodon_emoji,
char_limit: Keyword.get(@instance, :limit)
char_limit: limit
}
|> Jason.encode!()
@ -976,9 +983,29 @@ def put_settings(%{assigns: %{user: user}} = conn, %{"data" => settings} = _para
end
end
def login(conn, %{"code" => code}) do
with {:ok, app} <- get_or_make_app(),
%Authorization{} = auth <- Repo.get_by(Authorization, token: code, app_id: app.id),
{:ok, token} <- Token.exchange_token(app, auth) do
conn
|> put_session(:oauth_token, token.token)
|> redirect(to: "/web/getting-started")
end
end
def login(conn, _) do
conn
|> render(MastodonView, "login.html", %{error: false})
with {:ok, app} <- get_or_make_app() do
path =
o_auth_path(conn, :authorize,
response_type: "code",
client_id: app.client_id,
redirect_uri: ".",
scope: app.scopes
)
conn
|> redirect(to: path)
end
end
defp get_or_make_app() do
@ -997,22 +1024,6 @@ defp get_or_make_app() do
end
end
def login_post(conn, %{"authorization" => %{"name" => name, "password" => password}}) do
with %User{} = user <- User.get_by_nickname_or_email(name),
true <- Pbkdf2.checkpw(password, user.password_hash),
{:ok, app} <- get_or_make_app(),
{:ok, auth} <- Authorization.create_authorization(app, user),
{:ok, token} <- Token.exchange_token(app, auth) do
conn
|> put_session(:oauth_token, token.token)
|> redirect(to: "/web/getting-started")
else
_e ->
conn
|> render(MastodonView, "login.html", %{error: "Wrong username or password"})
end
end
def logout(conn, _) do
conn
|> clear_session
@ -1044,13 +1055,15 @@ def render_notification(user, %{id: id, activity: activity, inserted_at: created
NaiveDateTime.to_iso8601(created_at)
|> String.replace(~r/(\.\d+)?$/, ".000Z", global: false)
id = id |> to_string
case activity.data["type"] do
"Create" ->
%{
id: id,
type: "mention",
created_at: created_at,
account: AccountView.render("account.json", %{user: actor}),
account: AccountView.render("account.json", %{user: actor, for: user}),
status: StatusView.render("status.json", %{activity: activity, for: user})
}
@ -1061,7 +1074,7 @@ def render_notification(user, %{id: id, activity: activity, inserted_at: created
id: id,
type: "favourite",
created_at: created_at,
account: AccountView.render("account.json", %{user: actor}),
account: AccountView.render("account.json", %{user: actor, for: user}),
status: StatusView.render("status.json", %{activity: liked_activity, for: user})
}
@ -1072,7 +1085,7 @@ def render_notification(user, %{id: id, activity: activity, inserted_at: created
id: id,
type: "reblog",
created_at: created_at,
account: AccountView.render("account.json", %{user: actor}),
account: AccountView.render("account.json", %{user: actor, for: user}),
status: StatusView.render("status.json", %{activity: announced_activity, for: user})
}
@ -1081,7 +1094,7 @@ def render_notification(user, %{id: id, activity: activity, inserted_at: created
id: id,
type: "follow",
created_at: created_at,
account: AccountView.render("account.json", %{user: actor})
account: AccountView.render("account.json", %{user: actor, for: user})
}
_ ->
@ -1089,23 +1102,80 @@ def render_notification(user, %{id: id, activity: activity, inserted_at: created
end
end
def get_filters(%{assigns: %{user: user}} = conn, _) do
filters = Pleroma.Filter.get_filters(user)
res = FilterView.render("filters.json", filters: filters)
json(conn, res)
end
def create_filter(
%{assigns: %{user: user}} = conn,
%{"phrase" => phrase, "context" => context} = params
) do
query = %Pleroma.Filter{
user_id: user.id,
phrase: phrase,
context: context,
hide: Map.get(params, "irreversible", nil),
whole_word: Map.get(params, "whole_word", true)
# expires_at
}
{:ok, response} = Pleroma.Filter.create(query)
res = FilterView.render("filter.json", filter: response)
json(conn, res)
end
def get_filter(%{assigns: %{user: user}} = conn, %{"id" => filter_id}) do
filter = Pleroma.Filter.get(filter_id, user)
res = FilterView.render("filter.json", filter: filter)
json(conn, res)
end
def update_filter(
%{assigns: %{user: user}} = conn,
%{"phrase" => phrase, "context" => context, "id" => filter_id} = params
) do
query = %Pleroma.Filter{
user_id: user.id,
filter_id: filter_id,
phrase: phrase,
context: context,
hide: Map.get(params, "irreversible", nil),
whole_word: Map.get(params, "whole_word", true)
# expires_at
}
{:ok, response} = Pleroma.Filter.update(query)
res = FilterView.render("filter.json", filter: response)
json(conn, res)
end
def delete_filter(%{assigns: %{user: user}} = conn, %{"id" => filter_id}) do
query = %Pleroma.Filter{
user_id: user.id,
filter_id: filter_id
}
{:ok, _} = Pleroma.Filter.delete(query)
json(conn, %{})
end
def errors(conn, _) do
conn
|> put_status(500)
|> json("Something went wrong")
end
@suggestions Application.get_env(:pleroma, :suggestions)
def suggestions(%{assigns: %{user: user}} = conn, _) do
if Keyword.get(@suggestions, :enabled, false) do
api = Keyword.get(@suggestions, :third_party_engine, "")
timeout = Keyword.get(@suggestions, :timeout, 5000)
suggestions = Pleroma.Config.get(:suggestions)
host =
Application.get_env(:pleroma, Pleroma.Web.Endpoint)
|> Keyword.get(:url)
|> Keyword.get(:host)
if Keyword.get(suggestions, :enabled, false) do
api = Keyword.get(suggestions, :third_party_engine, "")
timeout = Keyword.get(suggestions, :timeout, 5000)
limit = Keyword.get(suggestions, :limit, 23)
host = Pleroma.Config.get([Pleroma.Web.Endpoint, :url, :host])
user = user.nickname
url = String.replace(api, "{{host}}", host) |> String.replace("{{user}}", user)
@ -1114,9 +1184,22 @@ def suggestions(%{assigns: %{user: user}} = conn, _) do
@httpoison.get(url, [], timeout: timeout, recv_timeout: timeout),
{:ok, data} <- Jason.decode(body) do
data2 =
Enum.slice(data, 0, 40)
Enum.slice(data, 0, limit)
|> Enum.map(fn x ->
Map.put(x, "id", User.get_or_fetch(x["acct"]).id)
Map.put(
x,
"id",
case User.get_or_fetch(x["acct"]) do
%{id: id} -> id
_ -> 0
end
)
end)
|> Enum.map(fn x ->
Map.put(x, "avatar", MediaProxy.url(x["avatar"]))
end)
|> Enum.map(fn x ->
Map.put(x, "avatar_static", MediaProxy.url(x["avatar_static"]))
end)
conn
@ -1128,4 +1211,23 @@ def suggestions(%{assigns: %{user: user}} = conn, _) do
json(conn, [])
end
end
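# Renders through the given view module; when the view returns nil (e.g. an activity the
# status view cannot display), responds with a 501 JSON error instead.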
def try_render(conn, renderer, target, params)
when is_binary(target) do
res = render(conn, renderer, target, params)
if res == nil do
conn
|> put_status(501)
|> json(%{error: "Can't display this activity"})
else
res
end
end
def try_render(conn, _, _, _) do
conn
|> put_status(501)
|> json(%{error: "Can't display this activity"})
end
end

View file

@ -11,9 +11,8 @@ defmodule Pleroma.Web.MastodonAPI.MastodonSocket do
timeout: :infinity
)
def connect(params, socket) do
with token when not is_nil(token) <- params["access_token"],
%Token{user_id: user_id} <- Repo.get_by(Token, token: token),
def connect(%{"access_token" => token} = params, socket) do
with %Token{user_id: user_id} <- Repo.get_by(Token, token: token),
%User{} = user <- Repo.get(User, user_id),
stream
when stream in [
@ -23,16 +22,40 @@ def connect(params, socket) do
"public:local:media",
"user",
"direct",
"list"
"list",
"hashtag"
] <- params["stream"] do
topic = if stream == "list", do: "list:#{params["list"]}", else: stream
topic =
case stream do
"hashtag" -> "hashtag:#{params["tag"]}"
"list" -> "list:#{params["list"]}"
_ -> stream
end
socket =
socket
|> assign(:topic, topic)
|> assign(:user, user)
Pleroma.Web.Streamer.add_socket(params["stream"], socket)
Pleroma.Web.Streamer.add_socket(topic, socket)
{:ok, socket}
else
_e -> :error
end
end
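# Token-less connections: only public timelines and hashtag streams may be followed anonymously.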
def connect(%{"stream" => stream} = params, socket)
when stream in ["public", "public:local", "hashtag"] do
topic =
case stream do
"hashtag" -> "hashtag:#{params["tag"]}"
_ -> stream
end
with socket =
socket
|> assign(:topic, topic) do
Pleroma.Web.Streamer.add_socket(topic, socket)
{:ok, socket}
else
_e -> :error

View file

@ -4,15 +4,17 @@ defmodule Pleroma.Web.MastodonAPI.AccountView do
alias Pleroma.Web.MastodonAPI.AccountView
alias Pleroma.Web.CommonAPI.Utils
alias Pleroma.Web.MediaProxy
alias Pleroma.HTML
def render("accounts.json", %{users: users} = opts) do
render_many(users, AccountView, "account.json", opts)
end
def render("account.json", %{user: user}) do
def render("account.json", %{user: user} = opts) do
image = User.avatar_url(user) |> MediaProxy.url()
header = User.banner_url(user) |> MediaProxy.url()
user_info = User.user_info(user)
bot = (user.info["source_data"]["type"] || "Person") in ["Application", "Service"]
emojis =
(user.info["source_data"]["tag"] || [])
@ -26,9 +28,16 @@ def render("account.json", %{user: user}) do
}
end)
fields =
(user.info["source_data"]["attachment"] || [])
|> Enum.filter(fn %{"type" => t} -> t == "PropertyValue" end)
|> Enum.map(fn fields -> Map.take(fields, ["name", "value"]) end)
bio = HTML.filter_tags(user.bio, User.html_filter_policy(opts[:for]))
%{
id: to_string(user.id),
username: hd(String.split(user.nickname, "@")),
username: username_from_nickname(user.nickname),
acct: user.nickname,
display_name: user.name || user.nickname,
locked: user_info.locked,
@ -36,18 +45,19 @@ def render("account.json", %{user: user}) do
followers_count: user_info.follower_count,
following_count: user_info.following_count,
statuses_count: user_info.note_count,
note: HtmlSanitizeEx.basic_html(user.bio) || "",
note: bio || "",
url: user.ap_id,
avatar: image,
avatar_static: image,
header: header,
header_static: header,
emojis: emojis,
fields: [],
fields: fields,
bot: bot,
source: %{
note: "",
privacy: "public",
sensitive: "false"
privacy: user_info.default_scope,
sensitive: false
}
}
end
@ -56,24 +66,42 @@ def render("mention.json", %{user: user}) do
%{
id: to_string(user.id),
acct: user.nickname,
username: hd(String.split(user.nickname, "@")),
username: username_from_nickname(user.nickname),
url: user.ap_id
}
end
def render("relationship.json", %{user: user, target: target}) do
follow_activity = Pleroma.Web.ActivityPub.Utils.fetch_latest_follow(user, target)
requested =
if follow_activity do
follow_activity.data["state"] == "pending"
else
false
end
%{
id: to_string(target.id),
following: User.following?(user, target),
followed_by: User.following?(target, user),
blocking: User.blocks?(user, target),
muting: false,
requested: false,
domain_blocking: false
muting_notifications: false,
requested: requested,
domain_blocking: false,
showing_reblogs: false,
endorsed: false
}
end
def render("relationships.json", %{user: user, targets: targets}) do
render_many(targets, AccountView, "relationship.json", user: user, as: :target)
end
defp username_from_nickname(string) when is_binary(string) do
hd(String.split(string, "@"))
end
defp username_from_nickname(_), do: nil
end

View file

@ -0,0 +1,27 @@
defmodule Pleroma.Web.MastodonAPI.FilterView do
use Pleroma.Web, :view
alias Pleroma.Web.MastodonAPI.FilterView
alias Pleroma.Web.CommonAPI.Utils
def render("filters.json", %{filters: filters} = opts) do
render_many(filters, FilterView, "filter.json", opts)
end
def render("filter.json", %{filter: filter}) do
expires_at =
if filter.expires_at do
Utils.to_masto_date(filter.expires_at)
else
nil
end
%{
id: to_string(filter.filter_id),
phrase: filter.phrase,
context: filter.context,
expires_at: expires_at,
irreversible: filter.hide,
whole_word: false
}
end
end

View file

@ -5,6 +5,7 @@ defmodule Pleroma.Web.MastodonAPI.StatusView do
alias Pleroma.Web.CommonAPI.Utils
alias Pleroma.Web.MediaProxy
alias Pleroma.Repo
alias Pleroma.HTML
# TODO: Add cached version.
defp get_replied_to_activities(activities) do
@ -33,6 +34,7 @@ def render("index.json", opts) do
"status.json",
Map.put(opts, :replied_to_activities, replied_to_activities)
)
|> Enum.filter(fn x -> not is_nil(x) end)
end
def render(
@ -59,9 +61,10 @@ def render(
in_reply_to_id: nil,
in_reply_to_account_id: nil,
reblog: reblogged,
content: reblogged[:content],
content: reblogged[:content] || "",
created_at: created_at,
reblogs_count: 0,
replies_count: 0,
favourites_count: 0,
reblogged: false,
favourited: false,
@ -111,15 +114,19 @@ def render("status.json", %{activity: %{data: %{"object" => object}} = activity}
emojis =
(activity.data["object"]["emoji"] || [])
|> Enum.map(fn {name, url} ->
name = HtmlSanitizeEx.strip_tags(name)
name = HTML.strip_tags(name)
url =
HtmlSanitizeEx.strip_tags(url)
HTML.strip_tags(url)
|> MediaProxy.url()
%{shortcode: name, url: url, static_url: url}
%{shortcode: name, url: url, static_url: url, visible_in_picker: false}
end)
content =
render_content(object)
|> HTML.filter_tags(User.html_filter_policy(opts[:for]))
%{
id: to_string(activity.id),
uri: object["id"],
@ -128,9 +135,10 @@ def render("status.json", %{activity: %{data: %{"object" => object}} = activity}
in_reply_to_id: reply_to && to_string(reply_to.id),
in_reply_to_account_id: reply_to_user && to_string(reply_to_user.id),
reblog: nil,
content: render_content(object),
content: content,
created_at: created_at,
reblogs_count: announcement_count,
replies_count: 0,
favourites_count: like_count,
reblogged: !!repeated,
favourited: !!favorited,
@ -151,10 +159,14 @@ def render("status.json", %{activity: %{data: %{"object" => object}} = activity}
}
end
def render("status.json", _) do
nil
end
def render("attachment.json", %{attachment: attachment}) do
[attachment_url | _] = attachment["url"]
media_type = attachment_url["mediaType"] || attachment_url["mimeType"]
href = attachment_url["href"]
media_type = attachment_url["mediaType"] || attachment_url["mimeType"] || "image"
href = attachment_url["href"] |> MediaProxy.url()
type =
cond do
@ -168,9 +180,9 @@ def render("attachment.json", %{attachment: attachment}) do
%{
id: to_string(attachment["id"] || hash_id),
url: MediaProxy.url(href),
url: href,
remote_url: href,
preview_url: MediaProxy.url(href),
preview_url: href,
text_url: href,
type: type,
description: attachment["name"]
@ -218,26 +230,24 @@ def render_content(%{"type" => "Video"} = object) do
if !!name and name != "" do
"<p><a href=\"#{object["id"]}\">#{name}</a></p>#{object["content"]}"
else
object["content"]
object["content"] || ""
end
HtmlSanitizeEx.basic_html(content)
content
end
def render_content(%{"type" => "Article"} = object) do
def render_content(%{"type" => object_type} = object) when object_type in ["Article", "Page"] do
summary = object["name"]
content =
if !!summary and summary != "" do
if !!summary and summary != "" and is_bitstring(object["url"]) do
"<p><a href=\"#{object["url"]}\">#{summary}</a></p>#{object["content"]}"
else
object["content"]
object["content"] || ""
end
HtmlSanitizeEx.basic_html(content)
content
end
def render_content(object) do
HtmlSanitizeEx.basic_html(object["content"])
end
def render_content(object), do: object["content"] || ""
end

View file

@ -1,97 +1,34 @@
defmodule Pleroma.Web.MediaProxy.MediaProxyController do
use Pleroma.Web, :controller
require Logger
alias Pleroma.{Web.MediaProxy, ReverseProxy}
@httpoison Application.get_env(:pleroma, :httpoison)
@default_proxy_opts [max_body_length: 25 * 1_048_576]
@max_body_length 25 * 1_048_576
@cache_control %{
default: "public, max-age=1209600",
error: "public, must-revalidate, max-age=160"
}
def remote(conn, %{"sig" => sig, "url" => url}) do
config = Application.get_env(:pleroma, :media_proxy, [])
with true <- Keyword.get(config, :enabled, false),
{:ok, url} <- Pleroma.Web.MediaProxy.decode_url(sig, url),
{:ok, content_type, body} <- proxy_request(url) do
conn
|> put_resp_content_type(content_type)
|> set_cache_header(:default)
|> send_resp(200, body)
def remote(conn, params = %{"sig" => sig64, "url" => url64}) do
with config <- Pleroma.Config.get([:media_proxy]),
true <- Keyword.get(config, :enabled, false),
{:ok, url} <- MediaProxy.decode_url(sig64, url64),
filename <- Path.basename(URI.parse(url).path),
:ok <- filename_matches(Map.has_key?(params, "filename"), conn.request_path, url) do
ReverseProxy.call(conn, url, Keyword.get(config, :proxy_opts, @default_proxy_opts))
else
false ->
send_error(conn, 404)
send_resp(conn, 404, Plug.Conn.Status.reason_phrase(404))
{:error, :invalid_signature} ->
send_error(conn, 403)
send_resp(conn, 403, Plug.Conn.Status.reason_phrase(403))
{:error, {:http, _, url}} ->
redirect_or_error(conn, url, Keyword.get(config, :redirect_on_failure, true))
{:wrong_filename, filename} ->
redirect(conn, external: MediaProxy.build_url(sig64, url64, filename))
end
end
defp proxy_request(link) do
headers = [
{"user-agent",
"Pleroma/MediaProxy; #{Pleroma.Web.base_url()} <#{
Application.get_env(:pleroma, :instance)[:email]
}>"}
]
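# Returns {:wrong_filename, filename} when the filename in the request path does not match
# the proxied media's filename; remote/2 then redirects to the canonical proxy URL.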
def filename_matches(has_filename, path, url) do
filename = MediaProxy.filename(url)
options =
@httpoison.process_request_options([:insecure, {:follow_redirect, true}]) ++
[{:pool, :default}]
with {:ok, 200, headers, client} <- :hackney.request(:get, link, headers, "", options),
headers = Enum.into(headers, Map.new()),
{:ok, body} <- proxy_request_body(client),
content_type <- proxy_request_content_type(headers, body) do
{:ok, content_type, body}
else
{:ok, status, _, _} ->
Logger.warn("MediaProxy: request failed, status #{status}, link: #{link}")
{:error, {:http, :bad_status, link}}
{:error, error} ->
Logger.warn("MediaProxy: request failed, error #{inspect(error)}, link: #{link}")
{:error, {:http, error, link}}
cond do
has_filename && filename && Path.basename(path) != filename -> {:wrong_filename, filename}
true -> :ok
end
end
defp set_cache_header(conn, key) do
Plug.Conn.put_resp_header(conn, "cache-control", @cache_control[key])
end
defp redirect_or_error(conn, url, true), do: redirect(conn, external: url)
defp redirect_or_error(conn, url, _), do: send_error(conn, 502, "Media proxy error: " <> url)
defp send_error(conn, code, body \\ "") do
conn
|> set_cache_header(:error)
|> send_resp(code, body)
end
defp proxy_request_body(client), do: proxy_request_body(client, <<>>)
defp proxy_request_body(client, body) when byte_size(body) < @max_body_length do
case :hackney.stream_body(client) do
{:ok, data} -> proxy_request_body(client, <<body::binary, data::binary>>)
:done -> {:ok, body}
{:error, reason} -> {:error, reason}
end
end
defp proxy_request_body(client, _) do
:hackney.close(client)
{:error, :body_too_large}
end
# TODO: the body is passed here as well because some hosts do not provide a content-type.
# At some point we may want to use magic numbers to discover the content-type and reply a proper one.
defp proxy_request_content_type(headers, _body) do
headers["Content-Type"] || headers["content-type"] || "image/jpeg"
end
end

View file

@ -3,6 +3,8 @@ defmodule Pleroma.Web.MediaProxy do
def url(nil), do: nil
def url(""), do: nil
def url(url = "/" <> _), do: url
def url(url) do
@ -15,7 +17,8 @@ def url(url) do
base64 = Base.url_encode64(url, @base64_opts)
sig = :crypto.hmac(:sha, secret, base64)
sig64 = sig |> Base.url_encode64(@base64_opts)
Keyword.get(config, :base_url, Pleroma.Web.base_url()) <> "/proxy/#{sig64}/#{base64}"
build_url(sig64, base64, filename(url))
end
end
@ -30,4 +33,20 @@ def decode_url(sig, url) do
{:error, :invalid_signature}
end
end
def filename(url_or_path) do
if path = URI.parse(url_or_path).path, do: Path.basename(path)
end
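# Joins base URL, "proxy", signature, encoded target URL and optional filename, dropping nil segments.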
def build_url(sig_base64, url_base64, filename \\ nil) do
[
Pleroma.Config.get([:media_proxy, :base_url], Pleroma.Web.base_url()),
"proxy",
sig_base64,
url_base64,
filename
]
|> Enum.filter(fn value -> value end)
|> Path.join()
end
end
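For illustration (the instance and file names below are made up), a URL produced by `url/1` now carries the original filename as its last segment:
# MediaProxy.url("https://remote.example/media/cat.png")
# => "https://my.instance/proxy/<sig64>/<url64>/cat.png"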

View file

@ -3,6 +3,11 @@ defmodule Pleroma.Web.Nodeinfo.NodeinfoController do
alias Pleroma.Stats
alias Pleroma.Web
alias Pleroma.{User, Repo}
alias Pleroma.Config
alias Pleroma.Web.ActivityPub.MRF
plug(Pleroma.Web.FederatingPlug)
def schemas(conn, _params) do
response = %{
@ -22,13 +27,73 @@ def nodeinfo(conn, %{"version" => "2.0"}) do
instance = Application.get_env(:pleroma, :instance)
media_proxy = Application.get_env(:pleroma, :media_proxy)
suggestions = Application.get_env(:pleroma, :suggestions)
chat = Application.get_env(:pleroma, :chat)
gopher = Application.get_env(:pleroma, :gopher)
stats = Stats.get_stats()
mrf_simple =
Application.get_env(:pleroma, :mrf_simple)
|> Enum.into(%{})
mrf_policies =
MRF.get_policies()
|> Enum.map(fn policy -> to_string(policy) |> String.split(".") |> List.last() end)
quarantined = Keyword.get(instance, :quarantined_instances)
quarantined =
if is_list(quarantined) do
quarantined
else
[]
end
staff_accounts =
User.moderator_user_query()
|> Repo.all()
|> Enum.map(fn u -> u.ap_id end)
mrf_user_allowlist =
Config.get([:mrf_user_allowlist], [])
|> Enum.into(%{}, fn {k, v} -> {k, length(v)} end)
mrf_transparency = Keyword.get(instance, :mrf_transparency)
federation_response =
if mrf_transparency do
%{
mrf_policies: mrf_policies,
mrf_simple: mrf_simple,
mrf_user_allowlist: mrf_user_allowlist,
quarantined_instances: quarantined
}
else
%{}
end
features = [
"pleroma_api",
"mastodon_api",
"mastodon_api_streaming",
if Keyword.get(media_proxy, :enabled) do
"media_proxy"
end,
if Keyword.get(gopher, :enabled) do
"gopher"
end,
if Keyword.get(chat, :enabled) do
"chat"
end,
if Keyword.get(suggestions, :enabled) do
"suggestions"
end
]
response = %{
version: "2.0",
software: %{
name: "pleroma",
version: Keyword.get(instance, :version)
name: Pleroma.Application.name(),
version: Pleroma.Application.version()
},
protocols: ["ostatus", "activitypub"],
services: %{
@ -45,14 +110,24 @@ def nodeinfo(conn, %{"version" => "2.0"}) do
metadata: %{
nodeName: Keyword.get(instance, :name),
nodeDescription: Keyword.get(instance, :description),
mediaProxy: Keyword.get(media_proxy, :enabled),
private: !Keyword.get(instance, :public, true),
suggestions: %{
enabled: Keyword.get(suggestions, :enabled, false),
thirdPartyEngine: Keyword.get(suggestions, :third_party_engine, ""),
timeout: Keyword.get(suggestions, :timeout, 5000),
limit: Keyword.get(suggestions, :limit, 23),
web: Keyword.get(suggestions, :web, "")
}
},
staffAccounts: staff_accounts,
federation: federation_response,
postFormats: Keyword.get(instance, :allowed_post_formats),
uploadLimits: %{
general: Keyword.get(instance, :upload_limit),
avatar: Keyword.get(instance, :avatar_upload_limit),
banner: Keyword.get(instance, :banner_upload_limit),
background: Keyword.get(instance, :background_upload_limit)
},
features: features
}
}

View file

@ -4,7 +4,7 @@ defmodule Pleroma.Web.OAuth.Authorization do
alias Pleroma.{User, Repo}
alias Pleroma.Web.OAuth.{Authorization, App}
import Ecto.{Changeset}
import Ecto.{Changeset, Query}
schema "oauth_authorizations" do
field(:token, :string)
@ -45,4 +45,12 @@ def use_token(%Authorization{used: false, valid_until: valid_until} = auth) do
end
def use_token(%Authorization{used: true}), do: {:error, "already used"}
def delete_user_authorizations(%User{id: user_id}) do
from(
a in Pleroma.Web.OAuth.Authorization,
where: a.user_id == ^user_id
)
|> Repo.delete_all()
end
end

View file

@ -33,22 +33,35 @@ def create_authorization(conn, %{
true <- Pbkdf2.checkpw(password, user.password_hash),
%App{} = app <- Repo.get_by(App, client_id: client_id),
{:ok, auth} <- Authorization.create_authorization(app, user) do
if redirect_uri == "urn:ietf:wg:oauth:2.0:oob" do
render(conn, "results.html", %{
auth: auth
})
else
connector = if String.contains?(redirect_uri, "?"), do: "&", else: "?"
url = "#{redirect_uri}#{connector}code=#{auth.token}"
# Special case: Local MastodonFE.
redirect_uri =
if redirect_uri == "." do
mastodon_api_url(conn, :login)
else
redirect_uri
end
url =
if params["state"] do
url <> "&state=#{params["state"]}"
else
url
end
cond do
redirect_uri == "urn:ietf:wg:oauth:2.0:oob" ->
render(conn, "results.html", %{
auth: auth
})
redirect(conn, external: url)
true ->
connector = if String.contains?(redirect_uri, "?"), do: "&", else: "?"
url = "#{redirect_uri}#{connector}"
url_params = %{:code => auth.token}
url_params =
if params["state"] do
Map.put(url_params, :state, params["state"])
else
url_params
end
url = "#{url}#{Plug.Conn.Query.encode(url_params)}"
redirect(conn, external: url)
end
end
end
@ -60,11 +73,13 @@ def token_exchange(conn, %{"grant_type" => "authorization_code"} = params) do
fixed_token = fix_padding(params["code"]),
%Authorization{} = auth <-
Repo.get_by(Authorization, token: fixed_token, app_id: app.id),
{:ok, token} <- Token.exchange_token(app, auth) do
{:ok, token} <- Token.exchange_token(app, auth),
{:ok, inserted_at} <- DateTime.from_naive(token.inserted_at, "Etc/UTC") do
response = %{
token_type: "Bearer",
access_token: token.token,
refresh_token: token.refresh_token,
created_at: DateTime.to_unix(inserted_at),
expires_in: 60 * 10,
scope: "read write follow"
}
@ -116,8 +131,23 @@ def token_exchange(
token_exchange(conn, params)
end
def token_revoke(conn, %{"token" => token} = params) do
with %App{} = app <- get_app_from_request(conn, params),
%Token{} = token <- Repo.get_by(Token, token: token, app_id: app.id),
{:ok, %Token{}} <- Repo.delete(token) do
json(conn, %{})
else
_error ->
# RFC 7009: invalid tokens [in the request] do not cause an error response
json(conn, %{})
end
end
# XXX - for whatever reason our token arrives urlencoded, but Plug.Conn should be
# decoding it. Investigate sometime.
defp fix_padding(token) do
token
|> URI.decode()
|> Base.url_decode64!(padding: false)
|> Base.url_encode64()
end

View file

@ -1,6 +1,8 @@
defmodule Pleroma.Web.OAuth.Token do
use Ecto.Schema
import Ecto.Query
alias Pleroma.{User, Repo}
alias Pleroma.Web.OAuth.{Token, App, Authorization}
@ -35,4 +37,12 @@ def create_token(%App{} = app, %User{} = user) do
Repo.insert(token)
end
def delete_user_tokens(%User{id: user_id}) do
from(
t in Pleroma.Web.OAuth.Token,
where: t.user_id == ^user_id
)
|> Repo.delete_all()
end
end

View file

@ -184,7 +184,10 @@ def to_simple_form(%{data: %{"type" => "Announce"}} = activity, user, with_autho
retweeted_xml = to_simple_form(retweeted_activity, retweeted_user, true)
mentions = activity.recipients |> get_mentions
mentions =
([retweeted_user.ap_id] ++ activity.recipients)
|> Enum.uniq()
|> get_mentions()
[
{:"activity:object-type", ['http://activitystrea.ms/schema/1.0/activity']},

View file

@ -11,6 +11,21 @@ defmodule Pleroma.Web.OStatus do
alias Pleroma.Web.OStatus.{FollowHandler, UnfollowHandler, NoteHandler, DeleteHandler}
alias Pleroma.Web.ActivityPub.Transmogrifier
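# Only activities whose object normalizes to a Note can be represented as OStatus.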
def is_representable?(%Activity{data: data}) do
object = Object.normalize(data["object"])
cond do
is_nil(object) ->
false
object.data["type"] == "Note" ->
true
true ->
false
end
end
def feed_path(user) do
"#{user.ap_id}/feed.atom"
end

View file

@ -1,7 +1,7 @@
defmodule Pleroma.Web.OStatus.OStatusController do
use Pleroma.Web, :controller
alias Pleroma.{User, Activity}
alias Pleroma.{User, Activity, Object}
alias Pleroma.Web.OStatus.{FeedRepresenter, ActivityRepresenter}
alias Pleroma.Repo
alias Pleroma.Web.{OStatus, Federator}
@ -10,6 +10,7 @@ defmodule Pleroma.Web.OStatus.OStatusController do
alias Pleroma.Web.ActivityPub.ActivityPubController
alias Pleroma.Web.ActivityPub.ActivityPub
plug(Pleroma.Web.FederatingPlug when action in [:salmon_incoming])
action_fallback(:errors)
def feed_redirect(conn, %{"nickname" => nickname}) do
@ -135,7 +136,7 @@ def notice(conn, %{"id" => id}) do
"html" ->
conn
|> put_resp_content_type("text/html")
|> send_file(200, "priv/static/index.html")
|> send_file(200, Application.app_dir(:pleroma, "priv/static/index.html"))
_ ->
represent_activity(conn, format, activity, user)
@ -152,10 +153,21 @@ def notice(conn, %{"id" => id}) do
end
end
defp represent_activity(conn, "activity+json", activity, user) do
defp represent_activity(
conn,
"activity+json",
%Activity{data: %{"type" => "Create"}} = activity,
user
) do
object = Object.normalize(activity.data["object"])
conn
|> put_resp_header("content-type", "application/activity+json")
|> json(ObjectView.render("object.json", %{object: activity}))
|> json(ObjectView.render("object.json", %{object: object}))
end
defp represent_activity(conn, "activity+json", _, _) do
{:error, :not_found}
end
defp represent_activity(conn, _, activity, user) do

View file

@ -3,41 +3,72 @@ defmodule Pleroma.Web.Router do
alias Pleroma.{Repo, User, Web.Router}
@instance Application.get_env(:pleroma, :instance)
@federating Keyword.get(@instance, :federating)
@public Keyword.get(@instance, :public)
@registrations_open Keyword.get(@instance, :registrations_open)
def user_fetcher(username) do
{:ok, Repo.get_by(User, %{nickname: username})}
end
pipeline :api do
plug(:accepts, ["json"])
plug(:fetch_session)
plug(Pleroma.Plugs.OAuthPlug)
plug(Pleroma.Plugs.AuthenticationPlug, %{fetcher: &Router.user_fetcher/1, optional: true})
plug(Pleroma.Plugs.BasicAuthDecoderPlug)
plug(Pleroma.Plugs.UserFetcherPlug)
plug(Pleroma.Plugs.SessionAuthenticationPlug)
plug(Pleroma.Plugs.LegacyAuthenticationPlug)
plug(Pleroma.Plugs.AuthenticationPlug)
plug(Pleroma.Plugs.UserEnabledPlug)
plug(Pleroma.Plugs.SetUserSessionIdPlug)
plug(Pleroma.Plugs.EnsureUserKeyPlug)
end
pipeline :authenticated_api do
plug(:accepts, ["json"])
plug(:fetch_session)
plug(Pleroma.Plugs.OAuthPlug)
plug(Pleroma.Plugs.AuthenticationPlug, %{fetcher: &Router.user_fetcher/1})
plug(Pleroma.Plugs.BasicAuthDecoderPlug)
plug(Pleroma.Plugs.UserFetcherPlug)
plug(Pleroma.Plugs.SessionAuthenticationPlug)
plug(Pleroma.Plugs.LegacyAuthenticationPlug)
plug(Pleroma.Plugs.AuthenticationPlug)
plug(Pleroma.Plugs.UserEnabledPlug)
plug(Pleroma.Plugs.SetUserSessionIdPlug)
plug(Pleroma.Plugs.EnsureAuthenticatedPlug)
end
pipeline :admin_api do
plug(:accepts, ["json"])
plug(:fetch_session)
plug(Pleroma.Plugs.OAuthPlug)
plug(Pleroma.Plugs.BasicAuthDecoderPlug)
plug(Pleroma.Plugs.UserFetcherPlug)
plug(Pleroma.Plugs.SessionAuthenticationPlug)
plug(Pleroma.Plugs.LegacyAuthenticationPlug)
plug(Pleroma.Plugs.AuthenticationPlug)
plug(Pleroma.Plugs.UserEnabledPlug)
plug(Pleroma.Plugs.SetUserSessionIdPlug)
plug(Pleroma.Plugs.EnsureAuthenticatedPlug)
plug(Pleroma.Plugs.UserIsAdminPlug)
end
pipeline :mastodon_html do
plug(:accepts, ["html"])
plug(:fetch_session)
plug(Pleroma.Plugs.OAuthPlug)
plug(Pleroma.Plugs.AuthenticationPlug, %{fetcher: &Router.user_fetcher/1, optional: true})
plug(Pleroma.Plugs.BasicAuthDecoderPlug)
plug(Pleroma.Plugs.UserFetcherPlug)
plug(Pleroma.Plugs.SessionAuthenticationPlug)
plug(Pleroma.Plugs.LegacyAuthenticationPlug)
plug(Pleroma.Plugs.AuthenticationPlug)
plug(Pleroma.Plugs.UserEnabledPlug)
plug(Pleroma.Plugs.SetUserSessionIdPlug)
plug(Pleroma.Plugs.EnsureUserKeyPlug)
end
pipeline :pleroma_html do
plug(:accepts, ["html"])
plug(:fetch_session)
plug(Pleroma.Plugs.OAuthPlug)
plug(Pleroma.Plugs.AuthenticationPlug, %{fetcher: &Router.user_fetcher/1, optional: true})
plug(Pleroma.Plugs.BasicAuthDecoderPlug)
plug(Pleroma.Plugs.UserFetcherPlug)
plug(Pleroma.Plugs.SessionAuthenticationPlug)
plug(Pleroma.Plugs.AuthenticationPlug)
plug(Pleroma.Plugs.EnsureUserKeyPlug)
end
pipeline :well_known do
@ -63,6 +94,23 @@ def user_fetcher(username) do
get("/emoji", UtilController, :emoji)
end
scope "/api/pleroma/admin", Pleroma.Web.AdminAPI do
pipe_through(:admin_api)
delete("/user", AdminAPIController, :user_delete)
post("/user", AdminAPIController, :user_create)
get("/permission_group/:nickname", AdminAPIController, :right_get)
get("/permission_group/:nickname/:permission_group", AdminAPIController, :right_get)
post("/permission_group/:nickname/:permission_group", AdminAPIController, :right_add)
delete("/permission_group/:nickname/:permission_group", AdminAPIController, :right_delete)
post("/relay", AdminAPIController, :relay_follow)
delete("/relay", AdminAPIController, :relay_unfollow)
get("/invite_token", AdminAPIController, :get_invite_token)
get("/password_reset", AdminAPIController, :get_password_reset)
end
scope "/", Pleroma.Web.TwitterAPI do
pipe_through(:pleroma_html)
get("/ostatus_subscribe", UtilController, :remote_follow)
@ -81,6 +129,7 @@ def user_fetcher(username) do
get("/authorize", OAuthController, :authorize)
post("/authorize", OAuthController, :create_authorization)
post("/token", OAuthController, :token_exchange)
post("/revoke", OAuthController, :token_revoke)
end
scope "/api/v1", Pleroma.Web.MastodonAPI do
@ -96,6 +145,7 @@ def user_fetcher(username) do
post("/accounts/:id/unblock", MastodonAPIController, :unblock)
post("/accounts/:id/mute", MastodonAPIController, :relationship_noop)
post("/accounts/:id/unmute", MastodonAPIController, :relationship_noop)
get("/accounts/:id/lists", MastodonAPIController, :account_lists)
get("/follow_requests", MastodonAPIController, :follow_requests)
post("/follow_requests/:id/authorize", MastodonAPIController, :authorize_follow_request)
@ -142,7 +192,15 @@ def user_fetcher(username) do
post("/domain_blocks", MastodonAPIController, :block_domain)
delete("/domain_blocks", MastodonAPIController, :unblock_domain)
get("/filters", MastodonAPIController, :get_filters)
post("/filters", MastodonAPIController, :create_filter)
get("/filters/:id", MastodonAPIController, :get_filter)
put("/filters/:id", MastodonAPIController, :update_filter)
delete("/filters/:id", MastodonAPIController, :delete_filter)
get("/suggestions", MastodonAPIController, :suggestions)
get("/endorsements", MastodonAPIController, :empty_array)
end
scope "/api/web", Pleroma.Web.MastodonAPI do
@ -211,11 +269,7 @@ def user_fetcher(username) do
end
scope "/api", Pleroma.Web do
if @public do
pipe_through(:api)
else
pipe_through(:authenticated_api)
end
pipe_through(:api)
get("/statuses/public_timeline", TwitterAPI.Controller, :public_timeline)
@ -228,7 +282,12 @@ def user_fetcher(username) do
get("/statuses/networkpublic_timeline", TwitterAPI.Controller, :public_and_external_timeline)
end
scope "/api", Pleroma.Web do
scope "/api", Pleroma.Web, as: :twitter_api_search do
pipe_through(:api)
get("/pleroma/search_user", TwitterAPI.Controller, :search_user)
end
scope "/api", Pleroma.Web, as: :authenticated_twitter_api do
pipe_through(:authenticated_api)
get("/account/verify_credentials", TwitterAPI.Controller, :verify_credentials)
@ -248,8 +307,13 @@ def user_fetcher(username) do
get("/statuses/friends_timeline", TwitterAPI.Controller, :friends_timeline)
get("/statuses/mentions", TwitterAPI.Controller, :mentions_timeline)
get("/statuses/mentions_timeline", TwitterAPI.Controller, :mentions_timeline)
get("/statuses/dm_timeline", TwitterAPI.Controller, :dm_timeline)
get("/qvitter/statuses/notifications", TwitterAPI.Controller, :notifications)
# XXX: this is really a pleroma API, but we want to keep the pleroma namespace clean
# for now.
post("/qvitter/statuses/notifications/read", TwitterAPI.Controller, :notifications_read)
post("/statuses/update", TwitterAPI.Controller, :status_update)
post("/statuses/retweet/:id", TwitterAPI.Controller, :retweet)
post("/statuses/unretweet/:id", TwitterAPI.Controller, :unretweet)
@ -282,6 +346,10 @@ def user_fetcher(username) do
get("/externalprofile/show", TwitterAPI.Controller, :external_profile)
end
pipeline :ap_relay do
plug(:accepts, ["activity+json"])
end
pipeline :ostatus do
plug(:accepts, ["xml", "atom", "html", "activity+json"])
end
@ -295,12 +363,10 @@ def user_fetcher(username) do
get("/users/:nickname/feed", OStatus.OStatusController, :feed)
get("/users/:nickname", OStatus.OStatusController, :feed_redirect)
if @federating do
post("/users/:nickname/salmon", OStatus.OStatusController, :salmon_incoming)
post("/push/hub/:nickname", Websub.WebsubController, :websub_subscription_request)
get("/push/subscriptions/:id", Websub.WebsubController, :websub_subscription_confirmation)
post("/push/subscriptions/:id", Websub.WebsubController, :websub_incoming)
end
post("/users/:nickname/salmon", OStatus.OStatusController, :salmon_incoming)
post("/push/hub/:nickname", Websub.WebsubController, :websub_subscription_request)
get("/push/subscriptions/:id", Websub.WebsubController, :websub_subscription_confirmation)
post("/push/subscriptions/:id", Websub.WebsubController, :websub_incoming)
end
pipeline :activitypub do
@ -317,24 +383,27 @@ def user_fetcher(username) do
get("/users/:nickname/outbox", ActivityPubController, :outbox)
end
if @federating do
scope "/", Pleroma.Web.ActivityPub do
pipe_through(:activitypub)
post("/users/:nickname/inbox", ActivityPubController, :inbox)
post("/inbox", ActivityPubController, :inbox)
end
scope "/relay", Pleroma.Web.ActivityPub do
pipe_through(:ap_relay)
get("/", ActivityPubController, :relay)
end
scope "/.well-known", Pleroma.Web do
pipe_through(:well_known)
scope "/", Pleroma.Web.ActivityPub do
pipe_through(:activitypub)
post("/users/:nickname/inbox", ActivityPubController, :inbox)
post("/inbox", ActivityPubController, :inbox)
end
get("/host-meta", WebFinger.WebFingerController, :host_meta)
get("/webfinger", WebFinger.WebFingerController, :webfinger)
get("/nodeinfo", Nodeinfo.NodeinfoController, :schemas)
end
scope "/.well-known", Pleroma.Web do
pipe_through(:well_known)
scope "/nodeinfo", Pleroma.Web do
get("/:version", Nodeinfo.NodeinfoController, :nodeinfo)
end
get("/host-meta", WebFinger.WebFingerController, :host_meta)
get("/webfinger", WebFinger.WebFingerController, :webfinger)
get("/nodeinfo", Nodeinfo.NodeinfoController, :schemas)
end
scope "/nodeinfo", Pleroma.Web do
get("/:version", Nodeinfo.NodeinfoController, :nodeinfo)
end
scope "/", Pleroma.Web.MastodonAPI do
@ -347,17 +416,19 @@ def user_fetcher(username) do
end
pipeline :remote_media do
plug(:accepts, ["html"])
end
scope "/proxy/", Pleroma.Web.MediaProxy do
pipe_through(:remote_media)
get("/:sig/:url", MediaProxyController, :remote)
get("/:sig/:url/:filename", MediaProxyController, :remote)
end
scope "/", Fallback do
get("/registration/:token", RedirectController, :registration_page)
get("/*path", RedirectController, :redirector)
options("/*path", RedirectController, :empty)
end
end
@ -365,14 +436,18 @@ defmodule Fallback.RedirectController do
use Pleroma.Web, :controller
def redirector(conn, _params) do
if Mix.env() != :test do
conn
|> put_resp_content_type("text/html")
|> send_file(200, "priv/static/index.html")
end
conn
|> put_resp_content_type("text/html")
|> send_file(200, Application.app_dir(:pleroma, "priv/static/index.html"))
end
def registration_page(conn, params) do
redirector(conn, params)
end
def empty(conn, _params) do
conn
|> put_status(204)
|> text("")
end
end

View file

@ -1,7 +1,8 @@
defmodule Pleroma.Web.Streamer do
use GenServer
require Logger
alias Pleroma.{User, Notification, Activity, Object}
alias Pleroma.{User, Notification, Activity, Object, Repo}
alias Pleroma.Web.ActivityPub.ActivityPub
def init(args) do
{:ok, args}
@ -60,8 +61,25 @@ def handle_cast(%{action: :stream, topic: "direct", item: item}, topics) do
end
def handle_cast(%{action: :stream, topic: "list", item: item}, topics) do
author = User.get_cached_by_ap_id(item.data["actor"])
# filter the recipient list if the activity is not public, see #270.
recipient_lists =
case ActivityPub.is_public?(item) do
true ->
Pleroma.List.get_lists_from_activity(item)
_ ->
Pleroma.List.get_lists_from_activity(item)
|> Enum.filter(fn list ->
owner = Repo.get(User, list.user_id)
ActivityPub.visible_for_user?(item, owner)
end)
end
recipient_topics =
Pleroma.List.get_lists_from_activity(item)
recipient_lists
|> Enum.map(fn %{id: id} -> "list:#{id}" end)
Enum.each(recipient_topics || [], fn list_topic ->
@ -152,16 +170,33 @@ defp represent_update(%Activity{} = activity, %User{} = user) do
|> Jason.encode!()
end
defp represent_update(%Activity{} = activity) do
%{
event: "update",
payload:
Pleroma.Web.MastodonAPI.StatusView.render(
"status.json",
activity: activity
)
|> Jason.encode!()
}
|> Jason.encode!()
end
def push_to_socket(topics, topic, %Activity{data: %{"type" => "Announce"}} = item) do
Enum.each(topics[topic] || [], fn socket ->
# Get the current user so we have up-to-date blocks etc.
user = User.get_cached_by_ap_id(socket.assigns[:user].ap_id)
blocks = user.info["blocks"] || []
if socket.assigns[:user] do
user = User.get_cached_by_ap_id(socket.assigns[:user].ap_id)
blocks = user.info["blocks"] || []
parent = Object.normalize(item.data["object"])
parent = Object.normalize(item.data["object"])
unless is_nil(parent) or item.actor in blocks or parent.data["actor"] in blocks do
send(socket.transport_pid, {:text, represent_update(item, user)})
unless is_nil(parent) or item.actor in blocks or parent.data["actor"] in blocks do
send(socket.transport_pid, {:text, represent_update(item, user)})
end
else
send(socket.transport_pid, {:text, represent_update(item)})
end
end)
end
@ -169,11 +204,15 @@ def push_to_socket(topics, topic, %Activity{data: %{"type" => "Announce"}} = ite
def push_to_socket(topics, topic, item) do
Enum.each(topics[topic] || [], fn socket ->
# Get the current user so we have up-to-date blocks etc.
user = User.get_cached_by_ap_id(socket.assigns[:user].ap_id)
blocks = user.info["blocks"] || []
if socket.assigns[:user] do
user = User.get_cached_by_ap_id(socket.assigns[:user].ap_id)
blocks = user.info["blocks"] || []
unless item.actor in blocks do
send(socket.transport_pid, {:text, represent_update(item, user)})
unless item.actor in blocks do
send(socket.transport_pid, {:text, represent_update(item, user)})
end
else
send(socket.transport_pid, {:text, represent_update(item)})
end
end)
end

View file

@ -2,7 +2,9 @@
<html>
<head>
<meta charset=utf-8 />
<title>Pleroma</title>
<title>
<%= Application.get_env(:pleroma, :instance)[:name] %>
</title>
<style>
body {
background-color: #282c37;

View file

@ -1,11 +0,0 @@
<h2>Login to Mastodon Frontend</h2>
<%= if @error do %>
<h2><%= @error %></h2>
<% end %>
<%= form_for @conn, mastodon_api_path(@conn, :login), [as: "authorization"], fn f -> %>
<%= text_input f, :name, placeholder: "Username or email" %>
<br>
<%= password_input f, :password, placeholder: "Password" %>
<br>
<%= submit "Log in" %>
<% end %>

View file

@ -6,7 +6,7 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
alias Pleroma.Web.WebFinger
alias Pleroma.Web.CommonAPI
alias Comeonin.Pbkdf2
alias Pleroma.Formatter
alias Pleroma.{Formatter, Emoji}
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.{Repo, PasswordResetToken, User}
@ -134,19 +134,20 @@ def do_remote_follow(%{assigns: %{user: user}} = conn, %{"user" => %{"id" => id}
end
end
@instance Application.get_env(:pleroma, :instance)
@instance_fe Application.get_env(:pleroma, :fe)
@instance_chat Application.get_env(:pleroma, :chat)
def config(conn, _params) do
instance = Pleroma.Config.get(:instance)
instance_fe = Pleroma.Config.get(:fe)
instance_chat = Pleroma.Config.get(:chat)
case get_format(conn) do
"xml" ->
response = """
<config>
<site>
<name>#{Keyword.get(@instance, :name)}</name>
<name>#{Keyword.get(instance, :name)}</name>
<site>#{Web.base_url()}</site>
<textlimit>#{Keyword.get(@instance, :limit)}</textlimit>
<closed>#{!Keyword.get(@instance, :registrations_open)}</closed>
<textlimit>#{Keyword.get(instance, :limit)}</textlimit>
<closed>#{!Keyword.get(instance, :registrations_open)}</closed>
</site>
</config>
"""
@ -156,34 +157,47 @@ def config(conn, _params) do
|> send_resp(200, response)
_ ->
json(conn, %{
site: %{
name: Keyword.get(@instance, :name),
description: Keyword.get(@instance, :description),
server: Web.base_url(),
textlimit: to_string(Keyword.get(@instance, :limit)),
closed: if(Keyword.get(@instance, :registrations_open), do: "0", else: "1"),
private: if(Keyword.get(@instance, :public, true), do: "0", else: "1"),
pleromafe: %{
theme: Keyword.get(@instance_fe, :theme),
background: Keyword.get(@instance_fe, :background),
logo: Keyword.get(@instance_fe, :logo),
redirectRootNoLogin: Keyword.get(@instance_fe, :redirect_root_no_login),
redirectRootLogin: Keyword.get(@instance_fe, :redirect_root_login),
chatDisabled: !Keyword.get(@instance_chat, :enabled),
showInstanceSpecificPanel: Keyword.get(@instance_fe, :show_instance_panel),
showWhoToFollowPanel: Keyword.get(@instance_fe, :show_who_to_follow_panel),
scopeOptionsEnabled: Keyword.get(@instance_fe, :scope_options_enabled),
whoToFollowProvider: Keyword.get(@instance_fe, :who_to_follow_provider),
whoToFollowLink: Keyword.get(@instance_fe, :who_to_follow_link)
}
}
})
data = %{
name: Keyword.get(instance, :name),
description: Keyword.get(instance, :description),
server: Web.base_url(),
textlimit: to_string(Keyword.get(instance, :limit)),
closed: if(Keyword.get(instance, :registrations_open), do: "0", else: "1"),
private: if(Keyword.get(instance, :public, true), do: "0", else: "1")
}
pleroma_fe = %{
theme: Keyword.get(instance_fe, :theme),
background: Keyword.get(instance_fe, :background),
logo: Keyword.get(instance_fe, :logo),
logoMask: Keyword.get(instance_fe, :logo_mask),
logoMargin: Keyword.get(instance_fe, :logo_margin),
redirectRootNoLogin: Keyword.get(instance_fe, :redirect_root_no_login),
redirectRootLogin: Keyword.get(instance_fe, :redirect_root_login),
chatDisabled: !Keyword.get(instance_chat, :enabled),
showInstanceSpecificPanel: Keyword.get(instance_fe, :show_instance_panel),
scopeOptionsEnabled: Keyword.get(instance_fe, :scope_options_enabled),
formattingOptionsEnabled: Keyword.get(instance_fe, :formatting_options_enabled),
collapseMessageWithSubject: Keyword.get(instance_fe, :collapse_message_with_subject),
hidePostStats: Keyword.get(instance_fe, :hide_post_stats),
hideUserStats: Keyword.get(instance_fe, :hide_user_stats)
}
managed_config = Keyword.get(instance, :managed_config)
data =
if managed_config do
data |> Map.put("pleromafe", pleroma_fe)
else
data
end
json(conn, %{site: data})
end
end
def version(conn, _params) do
version = Keyword.get(@instance, :version)
version = Pleroma.Application.named_version()
case get_format(conn) do
"xml" ->
@@ -199,7 +213,7 @@ def version(conn, _params) do
end
def emoji(conn, _params) do
json(conn, Enum.into(Formatter.get_custom_emoji(), %{}))
json(conn, Enum.into(Emoji.get_all(), %{}))
end
def follow_import(conn, %{"list" => %Plug.Upload{} = listfile}) do
@@ -212,7 +226,7 @@ def follow_import(%{assigns: %{user: user}} = conn, %{"list" => list}) do
|> Enum.map(fn account ->
with %User{} = follower <- User.get_cached_by_ap_id(user.ap_id),
%User{} = followed <- User.get_or_fetch(account),
{:ok, follower} <- User.follow(follower, followed) do
{:ok, follower} <- User.maybe_direct_follow(follower, followed) do
ActivityPub.follow(follower, followed)
else
err -> Logger.debug("follow_import: following #{account} failed with #{inspect(err)}")
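The controller changes above drop the compile-time `@instance`, `@instance_fe` and `@instance_chat` module attributes in favour of fetching the keyword lists at request time. A minimal, self-contained sketch of why that matters, assuming `Pleroma.Config.get/1` is a thin wrapper around `Application.get_env/3` (module names below are illustrative, not Pleroma's):

    defmodule CompileTimeName do
      # The attribute is expanded once, when the module is compiled; changing the
      # :pleroma, :instance config afterwards has no effect until a recompile.
      @instance Application.get_env(:pleroma, :instance, [])

      def get, do: Keyword.get(@instance, :name)
    end

    defmodule RuntimeName do
      # Resolved on every call, so configuration set or changed at runtime
      # (e.g. via Application.put_env/3) is picked up immediately.
      def get do
        :pleroma
        |> Application.get_env(:instance, [])
        |> Keyword.get(:name)
      end
    end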


@@ -7,6 +7,7 @@ defmodule Pleroma.Web.TwitterAPI.Representers.ActivityRepresenter do
alias Pleroma.Web.TwitterAPI.{TwitterAPI, UserView, ActivityView}
alias Pleroma.Web.CommonAPI.Utils
alias Pleroma.Formatter
alias Pleroma.HTML
defp user_by_ap_id(user_list, ap_id) do
Enum.find(user_list, fn %{ap_id: user_id} -> ap_id == user_id end)
@@ -167,7 +168,7 @@ def to_map(
{summary, content} = ActivityView.render_content(object)
html =
HtmlSanitizeEx.basic_html(content)
HTML.filter_tags(content, User.html_filter_policy(opts[:for]))
|> Formatter.emojify(object["emoji"])
video =
@@ -179,16 +180,24 @@ def to_map(
attachments = (object["attachment"] || []) ++ video
reply_parent = Activity.get_in_reply_to_activity(activity)
reply_user = reply_parent && User.get_cached_by_ap_id(reply_parent.actor)
%{
"id" => activity.id,
"uri" => activity.data["object"]["id"],
"user" => UserView.render("show.json", %{user: user, for: opts[:for]}),
"statusnet_html" => html,
"text" => HtmlSanitizeEx.strip_tags(content),
"text" => HTML.strip_tags(content),
"is_local" => activity.local,
"is_post_verb" => true,
"created_at" => created_at,
"in_reply_to_status_id" => object["inReplyToStatusId"],
"in_reply_to_screen_name" => reply_user && reply_user.nickname,
"in_reply_to_profileurl" => User.profile_url(reply_user),
"in_reply_to_ostatus_uri" => reply_user && reply_user.ap_id,
"in_reply_to_user_id" => reply_user && reply_user.id,
"statusnet_conversation_id" => conversation_id,
"attachments" => attachments |> ObjectRepresenter.enum_to_list(opts),
"attentions" => attentions,


@@ -9,16 +9,18 @@ def to_map(%Object{data: %{"url" => [url | _]}} = object, _opts) do
url: url["href"] |> Pleroma.Web.MediaProxy.url(),
mimetype: url["mediaType"] || url["mimeType"],
id: data["uuid"],
oembed: false
oembed: false,
description: data["name"]
}
end
def to_map(%Object{data: %{"url" => url} = data}, _opts) when is_binary(url) do
%{
url: url |> Pleroma.Web.MediaProxy.url(),
mimetype: data["mediaType"] || url["mimeType"],
mimetype: data["mediaType"] || data["mimeType"],
id: data["uuid"],
oembed: false
oembed: false,
description: data["name"]
}
end
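Both `to_map/2` clauses above now pass the attachment's `"name"` through as `description`, and the second clause also fixes the mimetype lookup to read `data["mimeType"]` instead of indexing into the binary `url`. A standalone sketch of the resulting shape (MediaProxy rewriting omitted for brevity; values are illustrative):

    defmodule AttachmentSketch do
      # First clause: "url" is a list of link maps.
      def to_map(%{"url" => [url | _]} = data) do
        %{
          url: url["href"],
          mimetype: url["mediaType"] || url["mimeType"],
          id: data["uuid"],
          oembed: false,
          description: data["name"]
        }
      end

      # Second clause: "url" is already a plain string.
      def to_map(%{"url" => url} = data) when is_binary(url) do
        %{
          url: url,
          mimetype: data["mediaType"] || data["mimeType"],
          id: data["uuid"],
          oembed: false,
          description: data["name"]
        }
      end
    end

    AttachmentSketch.to_map(%{
      "url" => [%{"href" => "https://example.com/x.png", "mediaType" => "image/png"}],
      "uuid" => "1234",
      "name" => "a red square"
    })
    # => %{description: "a red square", id: "1234", mimetype: "image/png",
    #      oembed: false, url: "https://example.com/x.png"}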


@@ -3,11 +3,10 @@ defmodule Pleroma.Web.TwitterAPI.TwitterAPI do
alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.TwitterAPI.UserView
alias Pleroma.Web.{OStatus, CommonAPI}
alias Pleroma.Web.MediaProxy
import Ecto.Query
@instance Application.get_env(:pleroma, :instance)
@httpoison Application.get_env(:pleroma, :httpoison)
@registrations_open Keyword.get(@instance, :registrations_open)
def create_status(%User{} = user, %{"status" => _} = data) do
CommonAPI.post(user, data)
@@ -23,7 +22,13 @@ def delete(%User{} = user, id) do
def follow(%User{} = follower, params) do
with {:ok, %User{} = followed} <- get_user(params),
{:ok, follower} <- User.maybe_direct_follow(follower, followed),
{:ok, activity} <- ActivityPub.follow(follower, followed) do
{:ok, activity} <- ActivityPub.follow(follower, followed),
{:ok, follower, followed} <-
User.wait_and_refresh(
Pleroma.Config.get([:activitypub, :follow_handshake_timeout]),
follower,
followed
) do
{:ok, follower, followed, activity}
else
err -> err
@@ -133,18 +138,20 @@ def register_user(params) do
password_confirmation: params["confirm"]
}
registrations_open = Pleroma.Config.get([:instance, :registrations_open])
# no need to query DB if registration is open
token =
unless @registrations_open || is_nil(tokenString) do
unless registrations_open || is_nil(tokenString) do
Repo.get_by(UserInviteToken, %{token: tokenString})
end
cond do
@registrations_open || (!is_nil(token) && !token.used) ->
registrations_open || (!is_nil(token) && !token.used) ->
changeset = User.register_changeset(%User{}, params)
with {:ok, user} <- Repo.insert(changeset) do
!@registrations_open && UserInviteToken.mark_as_used(token.token)
!registrations_open && UserInviteToken.mark_as_used(token.token)
{:ok, user}
else
{:error, changeset} ->
@@ -155,10 +162,10 @@ def register_user(params) do
{:error, %{error: errors}}
end
!@registrations_open && is_nil(token) ->
!registrations_open && is_nil(token) ->
{:error, "Invalid token"}
!@registrations_open && token.used ->
!registrations_open && token.used ->
{:error, "Expired token"}
end
end
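`register_user/1` now reads `registrations_open` with `Pleroma.Config.get/1` at call time and only queries the database for an invite token when registration is closed. A runnable sketch of the resulting gate, with a bare map standing in for the `UserInviteToken` record:

    defmodule RegistrationGateSketch do
      # `token` is nil when no invite token was supplied or none matched in the DB.
      def allowed?(registrations_open, token) do
        cond do
          registrations_open || (!is_nil(token) && !token.used) -> :ok
          is_nil(token) -> {:error, "Invalid token"}
          token.used -> {:error, "Expired token"}
        end
      end
    end

    RegistrationGateSketch.allowed?(true, nil)              # => :ok
    RegistrationGateSketch.allowed?(false, nil)             # => {:error, "Invalid token"}
    RegistrationGateSketch.allowed?(false, %{used: true})   # => {:error, "Expired token"}
    RegistrationGateSketch.allowed?(false, %{used: false})  # => :ok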
