diff --git a/CC-BY-4.0 b/CC-BY-4.0 new file mode 100644 index 000000000..4ea99c213 --- /dev/null +++ b/CC-BY-4.0 @@ -0,0 +1,395 @@ +Attribution 4.0 International + +======================================================================= + +Creative Commons Corporation ("Creative Commons") is not a law firm and +does not provide legal services or legal advice. Distribution of +Creative Commons public licenses does not create a lawyer-client or +other relationship. Creative Commons makes its licenses and related +information available on an "as-is" basis. Creative Commons gives no +warranties regarding its licenses, any material licensed under their +terms and conditions, or any related information. Creative Commons +disclaims all liability for damages resulting from their use to the +fullest extent possible. + +Using Creative Commons Public Licenses + +Creative Commons public licenses provide a standard set of terms and +conditions that creators and other rights holders may use to share +original works of authorship and other material subject to copyright +and certain other rights specified in the public license below. The +following considerations are for informational purposes only, are not +exhaustive, and do not form part of our licenses. + + Considerations for licensors: Our public licenses are + intended for use by those authorized to give the public + permission to use material in ways otherwise restricted by + copyright and certain other rights. Our licenses are + irrevocable. Licensors should read and understand the terms + and conditions of the license they choose before applying it. + Licensors should also secure all rights necessary before + applying our licenses so that the public can reuse the + material as expected. Licensors should clearly mark any + material not subject to the license. This includes other CC- + licensed material, or material used under an exception or + limitation to copyright. More considerations for licensors: + wiki.creativecommons.org/Considerations_for_licensors + + Considerations for the public: By using one of our public + licenses, a licensor grants the public permission to use the + licensed material under specified terms and conditions. If + the licensor's permission is not necessary for any reason--for + example, because of any applicable exception or limitation to + copyright--then that use is not regulated by the license. Our + licenses grant only permissions under copyright and certain + other rights that a licensor has authority to grant. Use of + the licensed material may still be restricted for other + reasons, including because others have copyright or other + rights in the material. A licensor may make special requests, + such as asking that all changes be marked or described. + Although not required by our licenses, you are encouraged to + respect those requests where reasonable. More considerations + for the public: + wiki.creativecommons.org/Considerations_for_licensees + +======================================================================= + +Creative Commons Attribution 4.0 International Public License + +By exercising the Licensed Rights (defined below), You accept and agree +to be bound by the terms and conditions of this Creative Commons +Attribution 4.0 International Public License ("Public License"). 
To the +extent this Public License may be interpreted as a contract, You are +granted the Licensed Rights in consideration of Your acceptance of +these terms and conditions, and the Licensor grants You such rights in +consideration of benefits the Licensor receives from making the +Licensed Material available under these terms and conditions. + + +Section 1 -- Definitions. + + a. Adapted Material means material subject to Copyright and Similar + Rights that is derived from or based upon the Licensed Material + and in which the Licensed Material is translated, altered, + arranged, transformed, or otherwise modified in a manner requiring + permission under the Copyright and Similar Rights held by the + Licensor. For purposes of this Public License, where the Licensed + Material is a musical work, performance, or sound recording, + Adapted Material is always produced where the Licensed Material is + synched in timed relation with a moving image. + + b. Adapter's License means the license You apply to Your Copyright + and Similar Rights in Your contributions to Adapted Material in + accordance with the terms and conditions of this Public License. + + c. Copyright and Similar Rights means copyright and/or similar rights + closely related to copyright including, without limitation, + performance, broadcast, sound recording, and Sui Generis Database + Rights, without regard to how the rights are labeled or + categorized. For purposes of this Public License, the rights + specified in Section 2(b)(1)-(2) are not Copyright and Similar + Rights. + + d. Effective Technological Measures means those measures that, in the + absence of proper authority, may not be circumvented under laws + fulfilling obligations under Article 11 of the WIPO Copyright + Treaty adopted on December 20, 1996, and/or similar international + agreements. + + e. Exceptions and Limitations means fair use, fair dealing, and/or + any other exception or limitation to Copyright and Similar Rights + that applies to Your use of the Licensed Material. + + f. Licensed Material means the artistic or literary work, database, + or other material to which the Licensor applied this Public + License. + + g. Licensed Rights means the rights granted to You subject to the + terms and conditions of this Public License, which are limited to + all Copyright and Similar Rights that apply to Your use of the + Licensed Material and that the Licensor has authority to license. + + h. Licensor means the individual(s) or entity(ies) granting rights + under this Public License. + + i. Share means to provide material to the public by any means or + process that requires permission under the Licensed Rights, such + as reproduction, public display, public performance, distribution, + dissemination, communication, or importation, and to make material + available to the public including in ways that members of the + public may access the material from a place and at a time + individually chosen by them. + + j. Sui Generis Database Rights means rights other than copyright + resulting from Directive 96/9/EC of the European Parliament and of + the Council of 11 March 1996 on the legal protection of databases, + as amended and/or succeeded, as well as other essentially + equivalent rights anywhere in the world. + + k. You means the individual or entity exercising the Licensed Rights + under this Public License. Your has a corresponding meaning. + + +Section 2 -- Scope. + + a. License grant. + + 1. 
Subject to the terms and conditions of this Public License, + the Licensor hereby grants You a worldwide, royalty-free, + non-sublicensable, non-exclusive, irrevocable license to + exercise the Licensed Rights in the Licensed Material to: + + a. reproduce and Share the Licensed Material, in whole or + in part; and + + b. produce, reproduce, and Share Adapted Material. + + 2. Exceptions and Limitations. For the avoidance of doubt, where + Exceptions and Limitations apply to Your use, this Public + License does not apply, and You do not need to comply with + its terms and conditions. + + 3. Term. The term of this Public License is specified in Section + 6(a). + + 4. Media and formats; technical modifications allowed. The + Licensor authorizes You to exercise the Licensed Rights in + all media and formats whether now known or hereafter created, + and to make technical modifications necessary to do so. The + Licensor waives and/or agrees not to assert any right or + authority to forbid You from making technical modifications + necessary to exercise the Licensed Rights, including + technical modifications necessary to circumvent Effective + Technological Measures. For purposes of this Public License, + simply making modifications authorized by this Section 2(a) + (4) never produces Adapted Material. + + 5. Downstream recipients. + + a. Offer from the Licensor -- Licensed Material. Every + recipient of the Licensed Material automatically + receives an offer from the Licensor to exercise the + Licensed Rights under the terms and conditions of this + Public License. + + b. No downstream restrictions. You may not offer or impose + any additional or different terms or conditions on, or + apply any Effective Technological Measures to, the + Licensed Material if doing so restricts exercise of the + Licensed Rights by any recipient of the Licensed + Material. + + 6. No endorsement. Nothing in this Public License constitutes or + may be construed as permission to assert or imply that You + are, or that Your use of the Licensed Material is, connected + with, or sponsored, endorsed, or granted official status by, + the Licensor or others designated to receive attribution as + provided in Section 3(a)(1)(A)(i). + + b. Other rights. + + 1. Moral rights, such as the right of integrity, are not + licensed under this Public License, nor are publicity, + privacy, and/or other similar personality rights; however, to + the extent possible, the Licensor waives and/or agrees not to + assert any such rights held by the Licensor to the limited + extent necessary to allow You to exercise the Licensed + Rights, but not otherwise. + + 2. Patent and trademark rights are not licensed under this + Public License. + + 3. To the extent possible, the Licensor waives any right to + collect royalties from You for the exercise of the Licensed + Rights, whether directly or through a collecting society + under any voluntary or waivable statutory or compulsory + licensing scheme. In all other cases the Licensor expressly + reserves any right to collect such royalties. + + +Section 3 -- License Conditions. + +Your exercise of the Licensed Rights is expressly made subject to the +following conditions. + + a. Attribution. + + 1. If You Share the Licensed Material (including in modified + form), You must: + + a. retain the following if it is supplied by the Licensor + with the Licensed Material: + + i. 
identification of the creator(s) of the Licensed + Material and any others designated to receive + attribution, in any reasonable manner requested by + the Licensor (including by pseudonym if + designated); + + ii. a copyright notice; + + iii. a notice that refers to this Public License; + + iv. a notice that refers to the disclaimer of + warranties; + + v. a URI or hyperlink to the Licensed Material to the + extent reasonably practicable; + + b. indicate if You modified the Licensed Material and + retain an indication of any previous modifications; and + + c. indicate the Licensed Material is licensed under this + Public License, and include the text of, or the URI or + hyperlink to, this Public License. + + 2. You may satisfy the conditions in Section 3(a)(1) in any + reasonable manner based on the medium, means, and context in + which You Share the Licensed Material. For example, it may be + reasonable to satisfy the conditions by providing a URI or + hyperlink to a resource that includes the required + information. + + 3. If requested by the Licensor, You must remove any of the + information required by Section 3(a)(1)(A) to the extent + reasonably practicable. + + 4. If You Share Adapted Material You produce, the Adapter's + License You apply must not prevent recipients of the Adapted + Material from complying with this Public License. + + +Section 4 -- Sui Generis Database Rights. + +Where the Licensed Rights include Sui Generis Database Rights that +apply to Your use of the Licensed Material: + + a. for the avoidance of doubt, Section 2(a)(1) grants You the right + to extract, reuse, reproduce, and Share all or a substantial + portion of the contents of the database; + + b. if You include all or a substantial portion of the database + contents in a database in which You have Sui Generis Database + Rights, then the database in which You have Sui Generis Database + Rights (but not its individual contents) is Adapted Material; and + + c. You must comply with the conditions in Section 3(a) if You Share + all or a substantial portion of the contents of the database. + +For the avoidance of doubt, this Section 4 supplements and does not +replace Your obligations under this Public License where the Licensed +Rights include other Copyright and Similar Rights. + + +Section 5 -- Disclaimer of Warranties and Limitation of Liability. + + a. UNLESS OTHERWISE SEPARATELY UNDERTAKEN BY THE LICENSOR, TO THE + EXTENT POSSIBLE, THE LICENSOR OFFERS THE LICENSED MATERIAL AS-IS + AND AS-AVAILABLE, AND MAKES NO REPRESENTATIONS OR WARRANTIES OF + ANY KIND CONCERNING THE LICENSED MATERIAL, WHETHER EXPRESS, + IMPLIED, STATUTORY, OR OTHER. THIS INCLUDES, WITHOUT LIMITATION, + WARRANTIES OF TITLE, MERCHANTABILITY, FITNESS FOR A PARTICULAR + PURPOSE, NON-INFRINGEMENT, ABSENCE OF LATENT OR OTHER DEFECTS, + ACCURACY, OR THE PRESENCE OR ABSENCE OF ERRORS, WHETHER OR NOT + KNOWN OR DISCOVERABLE. WHERE DISCLAIMERS OF WARRANTIES ARE NOT + ALLOWED IN FULL OR IN PART, THIS DISCLAIMER MAY NOT APPLY TO YOU. + + b. TO THE EXTENT POSSIBLE, IN NO EVENT WILL THE LICENSOR BE LIABLE + TO YOU ON ANY LEGAL THEORY (INCLUDING, WITHOUT LIMITATION, + NEGLIGENCE) OR OTHERWISE FOR ANY DIRECT, SPECIAL, INDIRECT, + INCIDENTAL, CONSEQUENTIAL, PUNITIVE, EXEMPLARY, OR OTHER LOSSES, + COSTS, EXPENSES, OR DAMAGES ARISING OUT OF THIS PUBLIC LICENSE OR + USE OF THE LICENSED MATERIAL, EVEN IF THE LICENSOR HAS BEEN + ADVISED OF THE POSSIBILITY OF SUCH LOSSES, COSTS, EXPENSES, OR + DAMAGES. 
WHERE A LIMITATION OF LIABILITY IS NOT ALLOWED IN FULL OR + IN PART, THIS LIMITATION MAY NOT APPLY TO YOU. + + c. The disclaimer of warranties and limitation of liability provided + above shall be interpreted in a manner that, to the extent + possible, most closely approximates an absolute disclaimer and + waiver of all liability. + + +Section 6 -- Term and Termination. + + a. This Public License applies for the term of the Copyright and + Similar Rights licensed here. However, if You fail to comply with + this Public License, then Your rights under this Public License + terminate automatically. + + b. Where Your right to use the Licensed Material has terminated under + Section 6(a), it reinstates: + + 1. automatically as of the date the violation is cured, provided + it is cured within 30 days of Your discovery of the + violation; or + + 2. upon express reinstatement by the Licensor. + + For the avoidance of doubt, this Section 6(b) does not affect any + right the Licensor may have to seek remedies for Your violations + of this Public License. + + c. For the avoidance of doubt, the Licensor may also offer the + Licensed Material under separate terms or conditions or stop + distributing the Licensed Material at any time; however, doing so + will not terminate this Public License. + + d. Sections 1, 5, 6, 7, and 8 survive termination of this Public + License. + + +Section 7 -- Other Terms and Conditions. + + a. The Licensor shall not be bound by any additional or different + terms or conditions communicated by You unless expressly agreed. + + b. Any arrangements, understandings, or agreements regarding the + Licensed Material not stated herein are separate from and + independent of the terms and conditions of this Public License. + + +Section 8 -- Interpretation. + + a. For the avoidance of doubt, this Public License does not, and + shall not be interpreted to, reduce, limit, restrict, or impose + conditions on any use of the Licensed Material that could lawfully + be made without permission under this Public License. + + b. To the extent possible, if any provision of this Public License is + deemed unenforceable, it shall be automatically reformed to the + minimum extent necessary to make it enforceable. If the provision + cannot be reformed, it shall be severed from this Public License + without affecting the enforceability of the remaining terms and + conditions. + + c. No term or condition of this Public License will be waived and no + failure to comply consented to unless expressly agreed to by the + Licensor. + + d. Nothing in this Public License constitutes or may be interpreted + as a limitation upon, or waiver of, any privileges and immunities + that apply to the Licensor or You, including from the legal + processes of any jurisdiction or authority. + + +======================================================================= + +Creative Commons is not a party to its public +licenses. Notwithstanding, Creative Commons may elect to apply one of +its public licenses to material it publishes and in those instances +will be considered the “Licensor.” The text of the Creative Commons +public licenses is dedicated to the public domain under the CC0 Public +Domain Dedication. 
Except for the limited purpose of indicating that +material is shared under a Creative Commons public license or as +otherwise permitted by the Creative Commons policies published at +creativecommons.org/policies, Creative Commons does not authorize the +use of the trademark "Creative Commons" or any other trademark or logo +of Creative Commons without its prior written consent including, +without limitation, in connection with any unauthorized modifications +to any of its public licenses or any other arrangements, +understandings, or agreements concerning use of licensed material. For +the avoidance of doubt, this paragraph does not form part of the +public licenses. + +Creative Commons may be contacted at creativecommons.org. diff --git a/CHANGELOG.md b/CHANGELOG.md index 0850deed7..cdd5b94a8 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -3,6 +3,17 @@ All notable changes to this project will be documented in this file. The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/). +## unreleased-patch - ??? + +### Added +- Rich media failure tracking (along with `:failure_backoff` option) + +### Fixed +- Mastodon API: Search parameter `following` now correctly returns the followings rather than the followers +- Mastodon API: Timelines hanging for (`number of posts with links * rich media timeout`) in the worst case. + Reduced to just rich media timeout. +- Password resets no longer processed for deactivated accounts + ## [2.1.0] - 2020-08-28 ### Changed diff --git a/Dockerfile b/Dockerfile index aa50e27ec..c210cf79c 100644 --- a/Dockerfile +++ b/Dockerfile @@ -31,7 +31,7 @@ LABEL maintainer="ops@pleroma.social" \ ARG HOME=/opt/pleroma ARG DATA=/var/lib/pleroma -RUN echo "http://nl.alpinelinux.org/alpine/latest-stable/community" >> /etc/apk/repositories &&\ +RUN echo "https://nl.alpinelinux.org/alpine/latest-stable/community" >> /etc/apk/repositories &&\ apk update &&\ apk add exiftool imagemagick ncurses postgresql-client &&\ adduser --system --shell /bin/false --home ${HOME} pleroma &&\ diff --git a/config/config.exs b/config/config.exs index 8caefc9ef..40200c6f6 100644 --- a/config/config.exs +++ b/config/config.exs @@ -415,6 +415,7 @@ Pleroma.Web.RichMedia.Parsers.TwitterCard, Pleroma.Web.RichMedia.Parsers.OEmbed ], + failure_backoff: 60_000, ttl_setters: [Pleroma.Web.RichMedia.Parser.TTL.AwsSignedUrl] config :pleroma, :media_proxy, @@ -743,19 +744,23 @@ config :pleroma, :pools, federation: [ size: 50, - max_waiting: 10 + max_waiting: 10, + timeout: 10_000 ], media: [ size: 50, - max_waiting: 10 + max_waiting: 10, + timeout: 10_000 ], upload: [ size: 25, - max_waiting: 5 + max_waiting: 5, + timeout: 15_000 ], default: [ size: 10, - max_waiting: 2 + max_waiting: 2, + timeout: 5_000 ] config :pleroma, :hackney_pools, diff --git a/config/description.exs b/config/description.exs index 29a657333..5e08ba109 100644 --- a/config/description.exs +++ b/config/description.exs @@ -2385,6 +2385,13 @@ suggestions: [ Pleroma.Web.RichMedia.Parser.TTL.AwsSignedUrl ] + }, + %{ + key: :failure_backoff, + type: :integer, + description: + "Amount of milliseconds after request failure, during which the request will not be retried.", + suggestions: [60_000] } ] }, diff --git a/docs/configuration/cheatsheet.md b/docs/configuration/cheatsheet.md index 2f440adf4..a9a650fab 100644 --- a/docs/configuration/cheatsheet.md +++ b/docs/configuration/cheatsheet.md @@ -361,6 +361,7 @@ config :pleroma, Pleroma.Web.MediaProxy.Invalidation.Http, * `ignore_hosts`: list of hosts which will be ignored by 
the metadata parser. For example `["accounts.google.com", "xss.website"]`, defaults to `[]`. * `ignore_tld`: list TLDs (top-level domains) which will ignore for parse metadata. default is ["local", "localdomain", "lan"]. * `parsers`: list of Rich Media parsers. +* `failure_backoff`: Amount of milliseconds after request failure, during which the request will not be retried. ## HTTP server diff --git a/docs/installation/freebsd_en.md b/docs/installation/freebsd_en.md index 130d68766..ca2575d9b 100644 --- a/docs/installation/freebsd_en.md +++ b/docs/installation/freebsd_en.md @@ -7,7 +7,7 @@ This document was written for FreeBSD 12.1, but should be work on future release This assumes the target system has `pkg(8)`. ``` -# pkg install elixir postgresql12-server postgresql12-client postgresql12-contrib git-lite sudo nginx gmake acme.sh +# pkg install elixir postgresql12-server postgresql12-client postgresql12-contrib git-lite sudo nginx gmake acme.sh cmake ``` Copy the rc.d scripts to the right directory: diff --git a/lib/mix/tasks/pleroma/frontend.ex b/lib/mix/tasks/pleroma/frontend.ex index 2adbf8d72..1957b1d84 100644 --- a/lib/mix/tasks/pleroma/frontend.ex +++ b/lib/mix/tasks/pleroma/frontend.ex @@ -69,7 +69,7 @@ def run(["install", frontend | args]) do fe_label = "#{frontend} (#{ref})" - tmp_dir = Path.join(dest, "tmp") + tmp_dir = Path.join([instance_static_dir, "frontends", "tmp"]) with {_, :ok} <- {:download_or_unzip, download_or_unzip(frontend_info, tmp_dir, options[:file])}, @@ -124,7 +124,9 @@ defp download_build(frontend_info, dest) do url = String.replace(frontend_info["build_url"], "${ref}", frontend_info["ref"]) with {:ok, %{status: 200, body: zip_body}} <- - Pleroma.HTTP.get(url, [], timeout: 120_000, recv_timeout: 120_000) do + Pleroma.HTTP.get(url, [], + adapter: [pool: :media, timeout: 120_000, recv_timeout: 120_000] + ) do unzip(zip_body, dest) else e -> {:error, e} @@ -133,6 +135,7 @@ defp download_build(frontend_info, dest) do defp install_frontend(frontend_info, source, dest) do from = frontend_info["build_dir"] || "dist" + File.rm_rf!(dest) File.mkdir_p!(dest) File.cp_r!(Path.join([source, from]), dest) :ok diff --git a/lib/pleroma/application.ex b/lib/pleroma/application.ex index c0b5db9f1..33b1e3872 100644 --- a/lib/pleroma/application.ex +++ b/lib/pleroma/application.ex @@ -22,13 +22,18 @@ def named_version, do: @name <> " " <> @version def repository, do: @repository def user_agent do - case Config.get([:http, :user_agent], :default) do - :default -> - info = "#{Pleroma.Web.base_url()} <#{Config.get([:instance, :email], "")}>" - named_version() <> "; " <> info + if Process.whereis(Pleroma.Web.Endpoint) do + case Config.get([:http, :user_agent], :default) do + :default -> + info = "#{Pleroma.Web.base_url()} <#{Config.get([:instance, :email], "")}>" + named_version() <> "; " <> info - custom -> - custom + custom -> + custom + end + else + # fallback, if endpoint is not started yet + "Pleroma Data Loader" end end @@ -39,6 +44,9 @@ def start(_type, _args) do # every time the application is restarted, so we disable module # conflicts at runtime Code.compiler_options(ignore_module_conflict: true) + # Disable warnings_as_errors at runtime, it breaks Phoenix live reload + # due to protocol consolidation warnings + Code.compiler_options(warnings_as_errors: false) Pleroma.Telemetry.Logger.attach() Config.Holder.save_default() Pleroma.HTML.compile_scrubbers() diff --git a/lib/pleroma/ecto_type/activity_pub/object_validators/emoji.ex 
b/lib/pleroma/ecto_type/activity_pub/object_validators/emoji.ex new file mode 100644 index 000000000..4aacc5c88 --- /dev/null +++ b/lib/pleroma/ecto_type/activity_pub/object_validators/emoji.ex @@ -0,0 +1,34 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2017-2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.EctoType.ActivityPub.ObjectValidators.Emoji do + use Ecto.Type + + def type, do: :map + + def cast(data) when is_map(data) do + has_invalid_emoji? = + Enum.find(data, fn + {name, uri} when is_binary(name) and is_binary(uri) -> + # based on ObjectValidators.Uri.cast() + case URI.parse(uri) do + %URI{host: nil} -> true + %URI{host: ""} -> true + %URI{scheme: scheme} when scheme in ["https", "http"] -> false + _ -> true + end + + {_name, _uri} -> + true + end) + + if has_invalid_emoji?, do: :error, else: {:ok, data} + end + + def cast(_data), do: :error + + def dump(data), do: {:ok, data} + + def load(data), do: {:ok, data} +end diff --git a/lib/pleroma/gun/connection_pool/worker.ex b/lib/pleroma/gun/connection_pool/worker.ex index fec9d0efa..c36332817 100644 --- a/lib/pleroma/gun/connection_pool/worker.ex +++ b/lib/pleroma/gun/connection_pool/worker.ex @@ -83,17 +83,25 @@ def handle_call(:remove_client, {client_pid, _}, %{key: key} = state) do end) {ref, state} = pop_in(state.client_monitors[client_pid]) - Process.demonitor(ref) - - timer = - if used_by == [] do - max_idle = Pleroma.Config.get([:connections_pool, :max_idle_time], 30_000) - Process.send_after(self(), :idle_close, max_idle) + # DOWN message can receive right after `remove_client` call and cause worker to terminate + state = + if is_nil(ref) do + state else - nil + Process.demonitor(ref) + + timer = + if used_by == [] do + max_idle = Pleroma.Config.get([:connections_pool, :max_idle_time], 30_000) + Process.send_after(self(), :idle_close, max_idle) + else + nil + end + + %{state | timer: timer} end - {:reply, :ok, %{state | timer: timer}, :hibernate} + {:reply, :ok, state, :hibernate} end @impl true @@ -103,16 +111,21 @@ def handle_info(:idle_close, state) do {:stop, :normal, state} end + @impl true + def handle_info({:gun_up, _pid, _protocol}, state) do + {:noreply, state, :hibernate} + end + # Gracefully shutdown if the connection got closed without any streams left @impl true def handle_info({:gun_down, _pid, _protocol, _reason, []}, state) do {:stop, :normal, state} end - # Otherwise, shutdown with an error + # Otherwise, wait for retry @impl true - def handle_info({:gun_down, _pid, _protocol, _reason, _killed_streams} = down_message, state) do - {:stop, {:error, down_message}, state} + def handle_info({:gun_down, _pid, _protocol, _reason, _killed_streams}, state) do + {:noreply, state, :hibernate} end @impl true diff --git a/lib/pleroma/html.ex b/lib/pleroma/html.ex index dc1b9b840..20b02f091 100644 --- a/lib/pleroma/html.ex +++ b/lib/pleroma/html.ex @@ -109,8 +109,9 @@ def extract_first_external_url(object, content) do result = content |> Floki.parse_fragment!() - |> Floki.filter_out("a.mention,a.hashtag,a.attachment,a[rel~=\"tag\"]") - |> Floki.attribute("a", "href") + |> Floki.find("a:not(.mention,.hashtag,.attachment,[rel~=\"tag\"])") + |> Enum.take(1) + |> Floki.attribute("href") |> Enum.at(0) {:commit, {:ok, result}} diff --git a/lib/pleroma/http/adapter_helper.ex b/lib/pleroma/http/adapter_helper.ex index 9ec3836b0..d72297323 100644 --- a/lib/pleroma/http/adapter_helper.ex +++ b/lib/pleroma/http/adapter_helper.ex @@ -11,7 +11,6 @@ defmodule 
Pleroma.HTTP.AdapterHelper do @type proxy_type() :: :socks4 | :socks5 @type host() :: charlist() | :inet.ip_address() - alias Pleroma.Config alias Pleroma.HTTP.AdapterHelper require Logger @@ -20,7 +19,6 @@ defmodule Pleroma.HTTP.AdapterHelper do | {Connection.proxy_type(), Connection.host(), pos_integer()} @callback options(keyword(), URI.t()) :: keyword() - @callback get_conn(URI.t(), keyword()) :: {:ok, term()} | {:error, term()} @spec format_proxy(String.t() | tuple() | nil) :: proxy() | nil def format_proxy(nil), do: nil @@ -44,27 +42,10 @@ def maybe_add_proxy(opts, proxy), do: Keyword.put_new(opts, :proxy, proxy) @spec options(URI.t(), keyword()) :: keyword() def options(%URI{} = uri, opts \\ []) do @defaults - |> put_timeout() |> Keyword.merge(opts) |> adapter_helper().options(uri) end - # For Hackney, this is the time a connection can stay idle in the pool. - # For Gun, this is the timeout to receive a message from Gun. - defp put_timeout(opts) do - {config_key, default} = - if adapter() == Tesla.Adapter.Gun do - {:pools, Config.get([:pools, :default, :timeout], 5_000)} - else - {:hackney_pools, 10_000} - end - - timeout = Config.get([config_key, opts[:pool], :timeout], default) - - Keyword.merge(opts, timeout: timeout) - end - - def get_conn(uri, opts), do: adapter_helper().get_conn(uri, opts) defp adapter, do: Application.get_env(:tesla, :adapter) defp adapter_helper do diff --git a/lib/pleroma/http/adapter_helper/gun.ex b/lib/pleroma/http/adapter_helper/gun.ex index b4ff8306c..4a967d8f2 100644 --- a/lib/pleroma/http/adapter_helper/gun.ex +++ b/lib/pleroma/http/adapter_helper/gun.ex @@ -5,7 +5,7 @@ defmodule Pleroma.HTTP.AdapterHelper.Gun do @behaviour Pleroma.HTTP.AdapterHelper - alias Pleroma.Gun.ConnectionPool + alias Pleroma.Config alias Pleroma.HTTP.AdapterHelper require Logger @@ -14,48 +14,55 @@ defmodule Pleroma.HTTP.AdapterHelper.Gun do connect_timeout: 5_000, domain_lookup_timeout: 5_000, tls_handshake_timeout: 5_000, - retry: 0, + retry: 1, retry_timeout: 1000, await_up_timeout: 5_000 ] + @type pool() :: :federation | :upload | :media | :default + @spec options(keyword(), URI.t()) :: keyword() def options(incoming_opts \\ [], %URI{} = uri) do proxy = - Pleroma.Config.get([:http, :proxy_url]) + [:http, :proxy_url] + |> Config.get() |> AdapterHelper.format_proxy() - config_opts = Pleroma.Config.get([:http, :adapter], []) + config_opts = Config.get([:http, :adapter], []) @defaults |> Keyword.merge(config_opts) |> add_scheme_opts(uri) |> AdapterHelper.maybe_add_proxy(proxy) |> Keyword.merge(incoming_opts) + |> put_timeout() end defp add_scheme_opts(opts, %{scheme: "http"}), do: opts defp add_scheme_opts(opts, %{scheme: "https"}) do - opts - |> Keyword.put(:certificates_verification, true) + Keyword.put(opts, :certificates_verification, true) end - @spec get_conn(URI.t(), keyword()) :: {:ok, keyword()} | {:error, atom()} - def get_conn(uri, opts) do - case ConnectionPool.get_conn(uri, opts) do - {:ok, conn_pid} -> {:ok, Keyword.merge(opts, conn: conn_pid, close_conn: false)} - err -> err - end + defp put_timeout(opts) do + # this is the timeout to receive a message from Gun + Keyword.put_new(opts, :timeout, pool_timeout(opts[:pool])) + end + + @spec pool_timeout(pool()) :: non_neg_integer() + def pool_timeout(pool) do + default = Config.get([:pools, :default, :timeout], 5_000) + + Config.get([:pools, pool, :timeout], default) end @prefix Pleroma.Gun.ConnectionPool def limiter_setup do - wait = Pleroma.Config.get([:connections_pool, :connection_acquisition_wait]) - retries = 
Pleroma.Config.get([:connections_pool, :connection_acquisition_retries]) + wait = Config.get([:connections_pool, :connection_acquisition_wait]) + retries = Config.get([:connections_pool, :connection_acquisition_retries]) :pools - |> Pleroma.Config.get([]) + |> Config.get([]) |> Enum.each(fn {name, opts} -> max_running = Keyword.get(opts, :size, 50) max_waiting = Keyword.get(opts, :max_waiting, 10) @@ -69,7 +76,6 @@ def limiter_setup do case result do :ok -> :ok {:error, :existing} -> :ok - e -> raise e end end) diff --git a/lib/pleroma/http/adapter_helper/hackney.ex b/lib/pleroma/http/adapter_helper/hackney.ex index cd569422b..f47a671ad 100644 --- a/lib/pleroma/http/adapter_helper/hackney.ex +++ b/lib/pleroma/http/adapter_helper/hackney.ex @@ -23,7 +23,4 @@ def options(connection_opts \\ [], %URI{} = uri) do end defp add_scheme_opts(opts, _), do: opts - - @spec get_conn(URI.t(), keyword()) :: {:ok, keyword()} - def get_conn(_uri, opts), do: {:ok, opts} end diff --git a/lib/pleroma/http/ex_aws.ex b/lib/pleroma/http/ex_aws.ex index e53e64077..c3f335c73 100644 --- a/lib/pleroma/http/ex_aws.ex +++ b/lib/pleroma/http/ex_aws.ex @@ -11,6 +11,8 @@ defmodule Pleroma.HTTP.ExAws do @impl true def request(method, url, body \\ "", headers \\ [], http_opts \\ []) do + http_opts = Keyword.put_new(http_opts, :adapter, pool: :upload) + case HTTP.request(method, url, body, headers, http_opts) do {:ok, env} -> {:ok, %{status_code: env.status, headers: env.headers, body: env.body}} diff --git a/lib/pleroma/http/http.ex b/lib/pleroma/http/http.ex index b37b3fa89..7bc73f4a0 100644 --- a/lib/pleroma/http/http.ex +++ b/lib/pleroma/http/http.ex @@ -62,28 +62,21 @@ def request(method, url, body, headers, options) when is_binary(url) do uri = URI.parse(url) adapter_opts = AdapterHelper.options(uri, options[:adapter] || []) - case AdapterHelper.get_conn(uri, adapter_opts) do - {:ok, adapter_opts} -> - options = put_in(options[:adapter], adapter_opts) - params = options[:params] || [] - request = build_request(method, headers, options, url, body, params) + options = put_in(options[:adapter], adapter_opts) + params = options[:params] || [] + request = build_request(method, headers, options, url, body, params) - adapter = Application.get_env(:tesla, :adapter) + adapter = Application.get_env(:tesla, :adapter) - client = Tesla.client(adapter_middlewares(adapter), adapter) + client = Tesla.client(adapter_middlewares(adapter), adapter) - maybe_limit( - fn -> - request(client, request) - end, - adapter, - adapter_opts - ) - - # Connection release is handled in a custom FollowRedirects middleware - err -> - err - end + maybe_limit( + fn -> + request(client, request) + end, + adapter, + adapter_opts + ) end @spec request(Client.t(), keyword()) :: {:ok, Env.t()} | {:error, any()} @@ -110,7 +103,7 @@ defp maybe_limit(fun, _, _) do end defp adapter_middlewares(Tesla.Adapter.Gun) do - [Pleroma.HTTP.Middleware.FollowRedirects] + [Tesla.Middleware.FollowRedirects, Pleroma.Tesla.Middleware.ConnectionPool] end defp adapter_middlewares(_), do: [] diff --git a/lib/pleroma/http/tzdata.ex b/lib/pleroma/http/tzdata.ex index 34bb253a7..4539ac359 100644 --- a/lib/pleroma/http/tzdata.ex +++ b/lib/pleroma/http/tzdata.ex @@ -11,6 +11,8 @@ defmodule Pleroma.HTTP.Tzdata do @impl true def get(url, headers, options) do + options = Keyword.put_new(options, :adapter, pool: :default) + with {:ok, %Tesla.Env{} = env} <- HTTP.get(url, headers, options) do {:ok, {env.status, env.headers, env.body}} end @@ -18,6 +20,8 @@ def get(url, headers, options) do 
@impl true def head(url, headers, options) do + options = Keyword.put_new(options, :adapter, pool: :default) + with {:ok, %Tesla.Env{} = env} <- HTTP.head(url, headers, options) do {:ok, {env.status, env.headers}} end diff --git a/lib/pleroma/instances/instance.ex b/lib/pleroma/instances/instance.ex index a1f935232..711c42158 100644 --- a/lib/pleroma/instances/instance.ex +++ b/lib/pleroma/instances/instance.ex @@ -150,7 +150,9 @@ def get_or_update_favicon(%URI{host: host} = instance_uri) do defp scrape_favicon(%URI{} = instance_uri) do try do with {:ok, %Tesla.Env{body: html}} <- - Pleroma.HTTP.get(to_string(instance_uri), [{:Accept, "text/html"}]), + Pleroma.HTTP.get(to_string(instance_uri), [{"accept", "text/html"}], + adapter: [pool: :media] + ), favicon_rel <- html |> Floki.parse_document!() diff --git a/lib/pleroma/notification.ex b/lib/pleroma/notification.ex index c1825f810..8868a910e 100644 --- a/lib/pleroma/notification.ex +++ b/lib/pleroma/notification.ex @@ -648,4 +648,16 @@ def for_user_and_activity(user, activity) do ) |> Repo.one() end + + @spec mark_context_as_read(User.t(), String.t()) :: {integer(), nil | [term()]} + def mark_context_as_read(%User{id: id}, context) do + from( + n in Notification, + join: a in assoc(n, :activity), + where: n.user_id == ^id, + where: n.seen == false, + where: fragment("?->>'context'", a.data) == ^context + ) + |> Repo.update_all(set: [seen: true]) + end end diff --git a/lib/pleroma/object/fetcher.ex b/lib/pleroma/object/fetcher.ex index 6fdbc8efd..1de2ce6c3 100644 --- a/lib/pleroma/object/fetcher.ex +++ b/lib/pleroma/object/fetcher.ex @@ -36,8 +36,7 @@ defp maybe_reinject_internal_fields(_, new_data), do: new_data defp reinject_object(%Object{data: %{"type" => "Question"}} = object, new_data) do Logger.debug("Reinjecting object #{new_data["id"]}") - with new_data <- Transmogrifier.fix_object(new_data), - data <- maybe_reinject_internal_fields(object, new_data), + with data <- maybe_reinject_internal_fields(object, new_data), {:ok, data, _} <- ObjectValidator.validate(data, %{}), changeset <- Object.change(object, %{data: data}), changeset <- touch_changeset(changeset), @@ -164,12 +163,12 @@ defp make_signature(id, date) do date: date }) - [{"signature", signature}] + {"signature", signature} end defp sign_fetch(headers, id, date) do if Pleroma.Config.get([:activitypub, :sign_object_fetches]) do - headers ++ make_signature(id, date) + [make_signature(id, date) | headers] else headers end @@ -177,7 +176,7 @@ defp sign_fetch(headers, id, date) do defp maybe_date_fetch(headers, date) do if Pleroma.Config.get([:activitypub, :sign_object_fetches]) do - headers ++ [{"date", date}] + [{"date", date} | headers] else headers end diff --git a/lib/pleroma/tesla/middleware/connection_pool.ex b/lib/pleroma/tesla/middleware/connection_pool.ex new file mode 100644 index 000000000..056e736ce --- /dev/null +++ b/lib/pleroma/tesla/middleware/connection_pool.ex @@ -0,0 +1,50 @@ +# Pleroma: A lightweight social networking server +# Copyright © 2020 Pleroma Authors +# SPDX-License-Identifier: AGPL-3.0-only + +defmodule Pleroma.Tesla.Middleware.ConnectionPool do + @moduledoc """ + Middleware to get/release connections from `Pleroma.Gun.ConnectionPool` + """ + + @behaviour Tesla.Middleware + + alias Pleroma.Gun.ConnectionPool + + @impl Tesla.Middleware + def call(%Tesla.Env{url: url, opts: opts} = env, next, _) do + uri = URI.parse(url) + + # Avoid leaking connections when the middleware is called twice + # with body_as: :chunks. 
We assume only the middleware can set + # opts[:adapter][:conn] + if opts[:adapter][:conn] do + ConnectionPool.release_conn(opts[:adapter][:conn]) + end + + case ConnectionPool.get_conn(uri, opts[:adapter]) do + {:ok, conn_pid} -> + adapter_opts = Keyword.merge(opts[:adapter], conn: conn_pid, close_conn: false) + opts = Keyword.put(opts, :adapter, adapter_opts) + env = %{env | opts: opts} + + case Tesla.run(env, next) do + {:ok, env} -> + unless opts[:adapter][:body_as] == :chunks do + ConnectionPool.release_conn(conn_pid) + {_, res} = pop_in(env.opts[:adapter][:conn]) + {:ok, res} + else + {:ok, env} + end + + err -> + ConnectionPool.release_conn(conn_pid) + err + end + + err -> + err + end + end +end diff --git a/lib/pleroma/tesla/middleware/follow_redirects.ex b/lib/pleroma/tesla/middleware/follow_redirects.ex deleted file mode 100644 index 5a7032215..000000000 --- a/lib/pleroma/tesla/middleware/follow_redirects.ex +++ /dev/null @@ -1,110 +0,0 @@ -# Pleroma: A lightweight social networking server -# Copyright © 2015-2020 Tymon Tobolski -# Copyright © 2020 Pleroma Authors -# SPDX-License-Identifier: AGPL-3.0-only - -defmodule Pleroma.HTTP.Middleware.FollowRedirects do - @moduledoc """ - Pool-aware version of https://github.com/teamon/tesla/blob/master/lib/tesla/middleware/follow_redirects.ex - - Follow 3xx redirects - ## Options - - `:max_redirects` - limit number of redirects (default: `5`) - """ - - alias Pleroma.Gun.ConnectionPool - - @behaviour Tesla.Middleware - - @max_redirects 5 - @redirect_statuses [301, 302, 303, 307, 308] - - @impl Tesla.Middleware - def call(env, next, opts \\ []) do - max = Keyword.get(opts, :max_redirects, @max_redirects) - - redirect(env, next, max) - end - - defp redirect(env, next, left) do - opts = env.opts[:adapter] - - case Tesla.run(env, next) do - {:ok, %{status: status} = res} when status in @redirect_statuses and left > 0 -> - release_conn(opts) - - case Tesla.get_header(res, "location") do - nil -> - {:ok, res} - - location -> - location = parse_location(location, res) - - case get_conn(location, opts) do - {:ok, opts} -> - %{env | opts: Keyword.put(env.opts, :adapter, opts)} - |> new_request(res.status, location) - |> redirect(next, left - 1) - - e -> - e - end - end - - {:ok, %{status: status}} when status in @redirect_statuses -> - release_conn(opts) - {:error, {__MODULE__, :too_many_redirects}} - - {:error, _} = e -> - release_conn(opts) - e - - other -> - unless opts[:body_as] == :chunks do - release_conn(opts) - end - - other - end - end - - defp get_conn(location, opts) do - uri = URI.parse(location) - - case ConnectionPool.get_conn(uri, opts) do - {:ok, conn} -> - {:ok, Keyword.merge(opts, conn: conn)} - - e -> - e - end - end - - defp release_conn(opts) do - ConnectionPool.release_conn(opts[:conn]) - end - - # The 303 (See Other) redirect was added in HTTP/1.1 to indicate that the originally - # requested resource is not available, however a related resource (or another redirect) - # available via GET is available at the specified location. - # https://tools.ietf.org/html/rfc7231#section-6.4.4 - defp new_request(env, 303, location), do: %{env | url: location, method: :get, query: []} - - # The 307 (Temporary Redirect) status code indicates that the target - # resource resides temporarily under a different URI and the user agent - # MUST NOT change the request method (...) 
- # https://tools.ietf.org/html/rfc7231#section-6.4.7 - defp new_request(env, 307, location), do: %{env | url: location} - - defp new_request(env, _, location), do: %{env | url: location, query: []} - - defp parse_location("https://" <> _rest = location, _env), do: location - defp parse_location("http://" <> _rest = location, _env), do: location - - defp parse_location(location, env) do - env.url - |> URI.parse() - |> URI.merge(location) - |> URI.to_string() - end -end diff --git a/lib/pleroma/uploaders/s3.ex b/lib/pleroma/uploaders/s3.ex index a13ff23b6..6dbef9085 100644 --- a/lib/pleroma/uploaders/s3.ex +++ b/lib/pleroma/uploaders/s3.ex @@ -46,12 +46,23 @@ def put_file(%Pleroma.Upload{} = upload) do op = if streaming do - upload.tempfile - |> ExAws.S3.Upload.stream_file() - |> ExAws.S3.upload(bucket, s3_name, [ - {:acl, :public_read}, - {:content_type, upload.content_type} - ]) + op = + upload.tempfile + |> ExAws.S3.Upload.stream_file() + |> ExAws.S3.upload(bucket, s3_name, [ + {:acl, :public_read}, + {:content_type, upload.content_type} + ]) + + if Application.get_env(:tesla, :adapter) == Tesla.Adapter.Gun do + # set s3 upload timeout to respect :upload pool timeout + # timeout should be slightly larger, so s3 can retry upload on fail + timeout = Pleroma.HTTP.AdapterHelper.Gun.pool_timeout(:upload) + 1_000 + opts = Keyword.put(op.opts, :timeout, timeout) + Map.put(op, :opts, opts) + else + op + end else {:ok, file_data} = File.read(upload.tempfile) diff --git a/lib/pleroma/user.ex b/lib/pleroma/user.ex index d2ad9516f..94c96de8d 100644 --- a/lib/pleroma/user.ex +++ b/lib/pleroma/user.ex @@ -83,7 +83,7 @@ defmodule Pleroma.User do ] schema "users" do - field(:bio, :string) + field(:bio, :string, default: "") field(:raw_bio, :string) field(:email, :string) field(:name, :string) @@ -1587,7 +1587,7 @@ def purge_user_changeset(user) do # "Right to be forgotten" # https://gdpr.eu/right-to-be-forgotten/ change(user, %{ - bio: nil, + bio: "", raw_bio: nil, email: nil, name: nil, diff --git a/lib/pleroma/user/search.ex b/lib/pleroma/user/search.ex index d4fd31069..adbef7fb8 100644 --- a/lib/pleroma/user/search.ex +++ b/lib/pleroma/user/search.ex @@ -116,7 +116,7 @@ defp trigram_rank(query, query_string) do end defp base_query(_user, false), do: User - defp base_query(user, true), do: User.get_followers_query(user) + defp base_query(user, true), do: User.get_friends_query(user) defp filter_invisible_users(query) do from(q in query, where: q.invisible == false) diff --git a/lib/pleroma/web/activity_pub/activity_pub.ex b/lib/pleroma/web/activity_pub/activity_pub.ex index 624a508ae..333621413 100644 --- a/lib/pleroma/web/activity_pub/activity_pub.ex +++ b/lib/pleroma/web/activity_pub/activity_pub.ex @@ -1224,7 +1224,7 @@ defp object_to_user_data(data) do name: data["name"], follower_address: data["followers"], following_address: data["following"], - bio: data["summary"], + bio: data["summary"] || "", actor_type: actor_type, also_known_as: Map.get(data, "alsoKnownAs", []), public_key: public_key, diff --git a/lib/pleroma/web/activity_pub/object_validators/audio_validator.ex b/lib/pleroma/web/activity_pub/object_validators/audio_validator.ex index d1869f188..1a97c504a 100644 --- a/lib/pleroma/web/activity_pub/object_validators/audio_validator.ex +++ b/lib/pleroma/web/activity_pub/object_validators/audio_validator.ex @@ -9,6 +9,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.AudioValidator do alias Pleroma.Web.ActivityPub.ObjectValidators.AttachmentValidator alias 
Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes alias Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations + alias Pleroma.Web.ActivityPub.Transmogrifier import Ecto.Changeset @@ -33,8 +34,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.AudioValidator do field(:attributedTo, ObjectValidators.ObjectID) field(:summary, :string) field(:published, ObjectValidators.DateTime) - # TODO: Write type - field(:emoji, :map, default: %{}) + field(:emoji, ObjectValidators.Emoji, default: %{}) field(:sensitive, :boolean, default: false) embeds_many(:attachment, AttachmentValidator) field(:replies_count, :integer, default: 0) @@ -83,6 +83,7 @@ defp fix(data) do data |> CommonFixes.fix_defaults() |> CommonFixes.fix_attribution() + |> Transmogrifier.fix_emoji() |> fix_url() end diff --git a/lib/pleroma/web/activity_pub/object_validators/chat_message_validator.ex b/lib/pleroma/web/activity_pub/object_validators/chat_message_validator.ex index 91b475393..6acd4a771 100644 --- a/lib/pleroma/web/activity_pub/object_validators/chat_message_validator.ex +++ b/lib/pleroma/web/activity_pub/object_validators/chat_message_validator.ex @@ -22,7 +22,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.ChatMessageValidator do field(:content, ObjectValidators.SafeText) field(:actor, ObjectValidators.ObjectID) field(:published, ObjectValidators.DateTime) - field(:emoji, :map, default: %{}) + field(:emoji, ObjectValidators.Emoji, default: %{}) embeds_one(:attachment, AttachmentValidator) end diff --git a/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex b/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex index 721749de0..720213d73 100644 --- a/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex +++ b/lib/pleroma/web/activity_pub/object_validators/common_fixes.ex @@ -11,8 +11,8 @@ def fix_defaults(data) do Utils.create_context(data["context"] || data["conversation"]) data - |> Map.put_new("context", context) - |> Map.put_new("context_id", context_id) + |> Map.put("context", context) + |> Map.put("context_id", context_id) end def fix_attribution(data) do diff --git a/lib/pleroma/web/activity_pub/object_validators/event_validator.ex b/lib/pleroma/web/activity_pub/object_validators/event_validator.ex index 07e4821a4..0b4c99dc0 100644 --- a/lib/pleroma/web/activity_pub/object_validators/event_validator.ex +++ b/lib/pleroma/web/activity_pub/object_validators/event_validator.ex @@ -9,6 +9,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.EventValidator do alias Pleroma.Web.ActivityPub.ObjectValidators.AttachmentValidator alias Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes alias Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations + alias Pleroma.Web.ActivityPub.Transmogrifier import Ecto.Changeset @@ -39,8 +40,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.EventValidator do field(:attributedTo, ObjectValidators.ObjectID) field(:published, ObjectValidators.DateTime) - # TODO: Write type - field(:emoji, :map, default: %{}) + field(:emoji, ObjectValidators.Emoji, default: %{}) field(:sensitive, :boolean, default: false) embeds_many(:attachment, AttachmentValidator) field(:replies_count, :integer, default: 0) @@ -74,6 +74,7 @@ defp fix(data) do data |> CommonFixes.fix_defaults() |> CommonFixes.fix_attribution() + |> Transmogrifier.fix_emoji() end def changeset(struct, data) do diff --git a/lib/pleroma/web/activity_pub/object_validators/note_validator.ex b/lib/pleroma/web/activity_pub/object_validators/note_validator.ex index 20e735619..ab4469a59 
100644 --- a/lib/pleroma/web/activity_pub/object_validators/note_validator.ex +++ b/lib/pleroma/web/activity_pub/object_validators/note_validator.ex @@ -6,6 +6,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.NoteValidator do use Ecto.Schema alias Pleroma.EctoType.ActivityPub.ObjectValidators + alias Pleroma.Web.ActivityPub.Transmogrifier import Ecto.Changeset @@ -32,8 +33,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.NoteValidator do field(:actor, ObjectValidators.ObjectID) field(:attributedTo, ObjectValidators.ObjectID) field(:published, ObjectValidators.DateTime) - # TODO: Write type - field(:emoji, :map, default: %{}) + field(:emoji, ObjectValidators.Emoji, default: %{}) field(:sensitive, :boolean, default: false) # TODO: Write type field(:attachment, {:array, :map}, default: []) @@ -53,7 +53,14 @@ def cast_and_validate(data) do |> validate_data() end + defp fix(data) do + data + |> Transmogrifier.fix_emoji() + end + def cast_data(data) do + data = fix(data) + %__MODULE__{} |> cast(data, __schema__(:fields)) end diff --git a/lib/pleroma/web/activity_pub/object_validators/question_validator.ex b/lib/pleroma/web/activity_pub/object_validators/question_validator.ex index 712047424..934d3c1ea 100644 --- a/lib/pleroma/web/activity_pub/object_validators/question_validator.ex +++ b/lib/pleroma/web/activity_pub/object_validators/question_validator.ex @@ -10,6 +10,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.QuestionValidator do alias Pleroma.Web.ActivityPub.ObjectValidators.CommonFixes alias Pleroma.Web.ActivityPub.ObjectValidators.CommonValidations alias Pleroma.Web.ActivityPub.ObjectValidators.QuestionOptionsValidator + alias Pleroma.Web.ActivityPub.Transmogrifier import Ecto.Changeset @@ -35,8 +36,7 @@ defmodule Pleroma.Web.ActivityPub.ObjectValidators.QuestionValidator do field(:attributedTo, ObjectValidators.ObjectID) field(:summary, :string) field(:published, ObjectValidators.DateTime) - # TODO: Write type - field(:emoji, :map, default: %{}) + field(:emoji, ObjectValidators.Emoji, default: %{}) field(:sensitive, :boolean, default: false) embeds_many(:attachment, AttachmentValidator) field(:replies_count, :integer, default: 0) @@ -85,6 +85,7 @@ defp fix(data) do data |> CommonFixes.fix_defaults() |> CommonFixes.fix_attribution() + |> Transmogrifier.fix_emoji() |> fix_closed() end diff --git a/lib/pleroma/web/activity_pub/transmogrifier.ex b/lib/pleroma/web/activity_pub/transmogrifier.ex index 76298c4a0..0831efadc 100644 --- a/lib/pleroma/web/activity_pub/transmogrifier.ex +++ b/lib/pleroma/web/activity_pub/transmogrifier.ex @@ -318,9 +318,6 @@ def fix_emoji(%{"tag" => tags} = object) when is_list(tags) do Map.put(mapping, name, data["icon"]["url"]) end) - # we merge mastodon and pleroma emoji into a single mapping, to allow for both wire formats - emoji = Map.merge(object["emoji"] || %{}, emoji) - Map.put(object, "emoji", emoji) end diff --git a/lib/pleroma/web/api_spec/operations/list_operation.ex b/lib/pleroma/web/api_spec/operations/list_operation.ex index c88ed5dd0..15039052e 100644 --- a/lib/pleroma/web/api_spec/operations/list_operation.ex +++ b/lib/pleroma/web/api_spec/operations/list_operation.ex @@ -114,7 +114,7 @@ def add_to_list_operation do description: "Add accounts to the given list.", operationId: "ListController.add_to_list", parameters: [id_param()], - requestBody: add_remove_accounts_request(), + requestBody: add_remove_accounts_request(true), security: [%{"oAuth" => ["write:lists"]}], responses: %{ 200 => Operation.response("Empty object", 
"application/json", %Schema{type: :object}) @@ -127,8 +127,16 @@ def remove_from_list_operation do tags: ["Lists"], summary: "Remove accounts from list", operationId: "ListController.remove_from_list", - parameters: [id_param()], - requestBody: add_remove_accounts_request(), + parameters: [ + id_param(), + Operation.parameter( + :account_ids, + :query, + %Schema{type: :array, items: %Schema{type: :string}}, + "Array of account IDs" + ) + ], + requestBody: add_remove_accounts_request(false), security: [%{"oAuth" => ["write:lists"]}], responses: %{ 200 => Operation.response("Empty object", "application/json", %Schema{type: :object}) @@ -171,7 +179,7 @@ defp create_update_request do ) end - defp add_remove_accounts_request do + defp add_remove_accounts_request(required) when is_boolean(required) do request_body( "Parameters", %Schema{ @@ -180,9 +188,9 @@ defp add_remove_accounts_request do properties: %{ account_ids: %Schema{type: :array, description: "Array of account IDs", items: FlakeID} }, - required: [:account_ids] + required: required && [:account_ids] }, - required: true + required: required ) end end diff --git a/lib/pleroma/web/auth/pleroma_authenticator.ex b/lib/pleroma/web/auth/pleroma_authenticator.ex index 200ca03dc..c611b3e09 100644 --- a/lib/pleroma/web/auth/pleroma_authenticator.ex +++ b/lib/pleroma/web/auth/pleroma_authenticator.ex @@ -68,7 +68,7 @@ def create_from_registration( nickname = value([registration_attrs["nickname"], Registration.nickname(registration)]) email = value([registration_attrs["email"], Registration.email(registration)]) name = value([registration_attrs["name"], Registration.name(registration)]) || nickname - bio = value([registration_attrs["bio"], Registration.description(registration)]) + bio = value([registration_attrs["bio"], Registration.description(registration)]) || "" random_password = :crypto.strong_rand_bytes(64) |> Base.encode64() diff --git a/lib/pleroma/web/common_api/common_api.ex b/lib/pleroma/web/common_api/common_api.ex index 5ad2b91c2..4ab533658 100644 --- a/lib/pleroma/web/common_api/common_api.ex +++ b/lib/pleroma/web/common_api/common_api.ex @@ -452,7 +452,8 @@ def unpin(id, user) do end def add_mute(user, activity) do - with {:ok, _} <- ThreadMute.add_mute(user.id, activity.data["context"]) do + with {:ok, _} <- ThreadMute.add_mute(user.id, activity.data["context"]), + _ <- Pleroma.Notification.mark_context_as_read(user, activity.data["context"]) do {:ok, activity} else {:error, _} -> {:error, dgettext("errors", "conversation is already muted")} diff --git a/lib/pleroma/web/mastodon_api/controllers/auth_controller.ex b/lib/pleroma/web/mastodon_api/controllers/auth_controller.ex index 753b3db3e..9f09550e1 100644 --- a/lib/pleroma/web/mastodon_api/controllers/auth_controller.ex +++ b/lib/pleroma/web/mastodon_api/controllers/auth_controller.ex @@ -59,17 +59,11 @@ def logout(conn, _) do def password_reset(conn, params) do nickname_or_email = params["email"] || params["nickname"] - with {:ok, _} <- TwitterAPI.password_reset(nickname_or_email) do - conn - |> put_status(:no_content) - |> json("") - else - {:error, "unknown user"} -> - send_resp(conn, :not_found, "") + TwitterAPI.password_reset(nickname_or_email) - {:error, _} -> - send_resp(conn, :bad_request, "") - end + conn + |> put_status(:no_content) + |> json("") end defp local_mastodon_root_path(conn) do diff --git a/lib/pleroma/web/mastodon_api/controllers/list_controller.ex b/lib/pleroma/web/mastodon_api/controllers/list_controller.ex index acdc76fd2..5daeaa780 100644 --- 
a/lib/pleroma/web/mastodon_api/controllers/list_controller.ex +++ b/lib/pleroma/web/mastodon_api/controllers/list_controller.ex @@ -74,7 +74,7 @@ def add_to_list(%{assigns: %{list: list}, body_params: %{account_ids: account_id # DELETE /api/v1/lists/:id/accounts def remove_from_list( - %{assigns: %{list: list}, body_params: %{account_ids: account_ids}} = conn, + %{assigns: %{list: list}, params: %{account_ids: account_ids}} = conn, _ ) do Enum.each(account_ids, fn account_id -> @@ -86,6 +86,10 @@ def remove_from_list( json(conn, %{}) end + def remove_from_list(%{body_params: params} = conn, _) do + remove_from_list(%{conn | params: params}, %{}) + end + defp list_by_id_and_user(%{assigns: %{user: user}, params: %{id: id}} = conn, _) do case Pleroma.List.get(id, user) do %Pleroma.List{} = list -> assign(conn, :list, list) diff --git a/lib/pleroma/web/mastodon_api/views/account_view.ex b/lib/pleroma/web/mastodon_api/views/account_view.ex index 864c0417f..d2a30a548 100644 --- a/lib/pleroma/web/mastodon_api/views/account_view.ex +++ b/lib/pleroma/web/mastodon_api/views/account_view.ex @@ -245,7 +245,7 @@ defp do_render("show.json", %{user: user} = opts) do followers_count: followers_count, following_count: following_count, statuses_count: user.note_count, - note: user.bio || "", + note: user.bio, url: user.uri || user.ap_id, avatar: image, avatar_static: image, diff --git a/lib/pleroma/web/mastodon_api/views/status_view.ex b/lib/pleroma/web/mastodon_api/views/status_view.ex index 01b8bb6bb..3fe1967be 100644 --- a/lib/pleroma/web/mastodon_api/views/status_view.ex +++ b/lib/pleroma/web/mastodon_api/views/status_view.ex @@ -23,6 +23,17 @@ defmodule Pleroma.Web.MastodonAPI.StatusView do import Pleroma.Web.ActivityPub.Visibility, only: [get_visibility: 1, visible_for_user?: 2] + # This is a naive way to do this, just spawning a process per activity + # to fetch the preview. However it should be fine considering + # pagination is restricted to 40 activities at a time + defp fetch_rich_media_for_activities(activities) do + Enum.each(activities, fn activity -> + spawn(fn -> + Pleroma.Web.RichMedia.Helpers.fetch_data_for_activity(activity) + end) + end) + end + # TODO: Add cached version. 
defp get_replied_to_activities([]), do: %{} @@ -80,6 +91,11 @@ def render("index.json", opts) do # To do: check AdminAPIControllerTest on the reasons behind nil activities in the list activities = Enum.filter(opts.activities, & &1) + + # Start fetching rich media before doing anything else, so that later calls to get the cards + # only block for timeout in the worst case, as opposed to + # length(activities_with_links) * timeout + fetch_rich_media_for_activities(activities) replied_to_activities = get_replied_to_activities(activities) parent_activities = diff --git a/lib/pleroma/web/metadata/opengraph.ex b/lib/pleroma/web/metadata/opengraph.ex index 68c871e71..bb1b23208 100644 --- a/lib/pleroma/web/metadata/opengraph.ex +++ b/lib/pleroma/web/metadata/opengraph.ex @@ -61,7 +61,7 @@ def build_tags(%{ @impl Provider def build_tags(%{user: user}) do - with truncated_bio = Utils.scrub_html_and_truncate(user.bio || "") do + with truncated_bio = Utils.scrub_html_and_truncate(user.bio) do [ {:meta, [ diff --git a/lib/pleroma/web/metadata/twitter_card.ex b/lib/pleroma/web/metadata/twitter_card.ex index 5d08ce422..df34b033f 100644 --- a/lib/pleroma/web/metadata/twitter_card.ex +++ b/lib/pleroma/web/metadata/twitter_card.ex @@ -40,7 +40,7 @@ def build_tags(%{activity_id: id, object: object, user: user}) do @impl Provider def build_tags(%{user: user}) do - with truncated_bio = Utils.scrub_html_and_truncate(user.bio || "") do + with truncated_bio = Utils.scrub_html_and_truncate(user.bio) do [ title_tag(user), {:meta, [property: "twitter:description", content: truncated_bio], []}, diff --git a/lib/pleroma/web/pleroma_api/controllers/chat_controller.ex b/lib/pleroma/web/pleroma_api/controllers/chat_controller.ex index 1f2e953f7..e8a1746d4 100644 --- a/lib/pleroma/web/pleroma_api/controllers/chat_controller.ex +++ b/lib/pleroma/web/pleroma_api/controllers/chat_controller.ex @@ -149,9 +149,7 @@ def index(%{assigns: %{user: %{id: user_id} = user}} = conn, _params) do from(c in Chat, where: c.user_id == ^user_id, where: c.recipient not in ^blocked_ap_ids, - order_by: [desc: c.updated_at], - inner_join: u in User, - on: u.ap_id == c.recipient + order_by: [desc: c.updated_at] ) |> Repo.all() diff --git a/lib/pleroma/web/rich_media/helpers.ex b/lib/pleroma/web/rich_media/helpers.ex index 6210f2c5a..2fb482b51 100644 --- a/lib/pleroma/web/rich_media/helpers.ex +++ b/lib/pleroma/web/rich_media/helpers.ex @@ -96,6 +96,6 @@ def rich_media_get(url) do @rich_media_options end - Pleroma.HTTP.get(url, headers, options) + Pleroma.HTTP.get(url, headers, adapter: options) end end diff --git a/lib/pleroma/web/rich_media/parser.ex b/lib/pleroma/web/rich_media/parser.ex index ca592833f..5727fda18 100644 --- a/lib/pleroma/web/rich_media/parser.ex +++ b/lib/pleroma/web/rich_media/parser.ex @@ -3,6 +3,8 @@ # SPDX-License-Identifier: AGPL-3.0-only defmodule Pleroma.Web.RichMedia.Parser do + require Logger + defp parsers do Pleroma.Config.get([:rich_media, :parsers]) end @@ -10,17 +12,34 @@ defp parsers do def parse(nil), do: {:error, "No URL provided"} if Pleroma.Config.get(:env) == :test do + @spec parse(String.t()) :: {:ok, map()} | {:error, any()} def parse(url), do: parse_url(url) else + @spec parse(String.t()) :: {:ok, map()} | {:error, any()} def parse(url) do - try do - Cachex.fetch!(:rich_media_cache, url, fn _ -> - {:commit, parse_url(url)} - end) - |> set_ttl_based_on_image(url) - rescue - e -> - {:error, "Cachex error: #{inspect(e)}"} + with {:ok, data} <- get_cached_or_parse(url), + {:ok, _} <- 
set_ttl_based_on_image(data, url) do + {:ok, data} + else + {:error, {:invalid_metadata, data}} = e -> + Logger.debug(fn -> "Incomplete or invalid metadata for #{url}: #{inspect(data)}" end) + e + + error -> + Logger.error(fn -> "Rich media error for #{url}: #{inspect(error)}" end) + error + end + end + + defp get_cached_or_parse(url) do + case Cachex.fetch!(:rich_media_cache, url, fn _ -> {:commit, parse_url(url)} end) do + {:ok, _data} = res -> + res + + {:error, _} = e -> + ttl = Pleroma.Config.get([:rich_media, :failure_backoff], 60_000) + Cachex.expire(:rich_media_cache, url, ttl) + e end end end @@ -47,19 +66,26 @@ def ttl(data, url) do config :pleroma, :rich_media, ttl_setters: [MyModule] """ - def set_ttl_based_on_image({:ok, data}, url) do - with {:ok, nil} <- Cachex.ttl(:rich_media_cache, url), - ttl when is_number(ttl) <- get_ttl_from_image(data, url) do - Cachex.expire_at(:rich_media_cache, url, ttl * 1000) - {:ok, data} - else + @spec set_ttl_based_on_image(map(), String.t()) :: + {:ok, Integer.t() | :noop} | {:error, :no_key} + def set_ttl_based_on_image(data, url) do + case get_ttl_from_image(data, url) do + {:ok, ttl} when is_number(ttl) -> + ttl = ttl * 1000 + + case Cachex.expire_at(:rich_media_cache, url, ttl) do + {:ok, true} -> {:ok, ttl} + {:ok, false} -> {:error, :no_key} + end + _ -> - {:ok, data} + {:ok, :noop} end end defp get_ttl_from_image(data, url) do - Pleroma.Config.get([:rich_media, :ttl_setters]) + [:rich_media, :ttl_setters] + |> Pleroma.Config.get() |> Enum.reduce({:ok, nil}, fn module, {:ok, _ttl} -> module.ttl(data, url) @@ -69,24 +95,17 @@ defp get_ttl_from_image(data, url) do end) end - defp parse_url(url) do - try do - {:ok, %Tesla.Env{body: html}} = Pleroma.Web.RichMedia.Helpers.rich_media_get(url) - + def parse_url(url) do + with {:ok, %Tesla.Env{body: html}} <- Pleroma.Web.RichMedia.Helpers.rich_media_get(url), + {:ok, html} <- Floki.parse_document(html) do html - |> parse_html() |> maybe_parse() |> Map.put("url", url) |> clean_parsed_data() |> check_parsed_data() - rescue - e -> - {:error, "Parsing error: #{inspect(e)} #{inspect(__STACKTRACE__)}"} end end - defp parse_html(html), do: Floki.parse_document!(html) - defp maybe_parse(html) do Enum.reduce_while(parsers(), %{}, fn parser, acc -> case parser.parse(html, acc) do @@ -102,7 +121,7 @@ defp check_parsed_data(%{"title" => title} = data) end defp check_parsed_data(data) do - {:error, "Found metadata was invalid or incomplete: #{inspect(data)}"} + {:error, {:invalid_metadata, data}} end defp clean_parsed_data(data) do diff --git a/lib/pleroma/web/rich_media/parsers/ttl/aws_signed_url.ex b/lib/pleroma/web/rich_media/parsers/ttl/aws_signed_url.ex index 0dc1efdaf..c5aaea2d4 100644 --- a/lib/pleroma/web/rich_media/parsers/ttl/aws_signed_url.ex +++ b/lib/pleroma/web/rich_media/parsers/ttl/aws_signed_url.ex @@ -10,20 +10,15 @@ def ttl(data, _url) do |> parse_query_params() |> format_query_params() |> get_expiration_timestamp() + else + {:error, "Not aws signed url #{inspect(image)}"} end end - defp is_aws_signed_url(""), do: nil - defp is_aws_signed_url(nil), do: nil - - defp is_aws_signed_url(image) when is_binary(image) do + defp is_aws_signed_url(image) when is_binary(image) and image != "" do %URI{host: host, query: query} = URI.parse(image) - if String.contains?(host, "amazonaws.com") and String.contains?(query, "X-Amz-Expires") do - image - else - nil - end + String.contains?(host, "amazonaws.com") and String.contains?(query, "X-Amz-Expires") end defp is_aws_signed_url(_), do: nil @@ -46,6 
+41,6 @@ defp get_expiration_timestamp(params) when is_map(params) do |> Map.get("X-Amz-Date") |> Timex.parse("{ISO:Basic:Z}") - Timex.to_unix(date) + String.to_integer(Map.get(params, "X-Amz-Expires")) + {:ok, Timex.to_unix(date) + String.to_integer(Map.get(params, "X-Amz-Expires"))} end end diff --git a/lib/pleroma/web/twitter_api/twitter_api.ex b/lib/pleroma/web/twitter_api/twitter_api.ex index 2294d9d0d..5d7948507 100644 --- a/lib/pleroma/web/twitter_api/twitter_api.ex +++ b/lib/pleroma/web/twitter_api/twitter_api.ex @@ -72,7 +72,7 @@ defp maybe_notify_admins(%User{} = account) do def password_reset(nickname_or_email) do with true <- is_binary(nickname_or_email), - %User{local: true, email: email} = user when is_binary(email) <- + %User{local: true, email: email, deactivated: false} = user when is_binary(email) <- User.get_by_nickname_or_email(nickname_or_email), {:ok, token_record} <- Pleroma.PasswordResetToken.create_token(user) do user @@ -81,17 +81,8 @@ def password_reset(nickname_or_email) do {:ok, :enqueued} else - false -> - {:error, "bad user identifier"} - - %User{local: true, email: nil} -> + _ -> {:ok, :noop} - - %User{local: false} -> - {:error, "remote user"} - - nil -> - {:error, "unknown user"} end end diff --git a/lib/pleroma/web/web_finger/web_finger.ex b/lib/pleroma/web/web_finger/web_finger.ex index c4051e63e..6629f5356 100644 --- a/lib/pleroma/web/web_finger/web_finger.ex +++ b/lib/pleroma/web/web_finger/web_finger.ex @@ -136,12 +136,12 @@ def get_template_from_xml(body) do def find_lrdd_template(domain) do with {:ok, %{status: status, body: body}} when status in 200..299 <- - HTTP.get("http://#{domain}/.well-known/host-meta", []) do + HTTP.get("http://#{domain}/.well-known/host-meta") do get_template_from_xml(body) else _ -> with {:ok, %{body: body, status: status}} when status in 200..299 <- - HTTP.get("https://#{domain}/.well-known/host-meta", []) do + HTTP.get("https://#{domain}/.well-known/host-meta") do get_template_from_xml(body) else e -> {:error, "Can't find LRDD template: #{inspect(e)}"} diff --git a/mix.exs b/mix.exs index 4de0c78db..c324960c5 100644 --- a/mix.exs +++ b/mix.exs @@ -134,7 +134,9 @@ defp deps do {:cachex, "~> 3.2"}, {:poison, "~> 3.0", override: true}, {:tesla, - github: "teamon/tesla", ref: "af3707078b10793f6a534938e56b963aff82fe3c", override: true}, + git: "https://git.pleroma.social/pleroma/elixir-libraries/tesla.git", + ref: "3a2789d8535f7b520ebbadc4494227e5ba0e5365", + override: true}, {:castore, "~> 0.1"}, {:cowlib, "~> 2.9", override: true}, {:gun, diff --git a/mix.lock b/mix.lock index 86d0a75d7..deb07eb68 100644 --- a/mix.lock +++ b/mix.lock @@ -42,8 +42,8 @@ "ex_machina": {:hex, :ex_machina, "2.4.0", "09a34c5d371bfb5f78399029194a8ff67aff340ebe8ba19040181af35315eabb", [:mix], [{:ecto, "~> 2.2 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_sql, "~> 3.0", [hex: :ecto_sql, repo: "hexpm", optional: true]}], "hexpm", "a20bc9ddc721b33ea913b93666c5d0bdca5cbad7a67540784ae277228832d72c"}, "ex_syslogger": {:hex, :ex_syslogger, "1.5.2", "72b6aa2d47a236e999171f2e1ec18698740f40af0bd02c8c650bf5f1fd1bac79", [:mix], [{:poison, ">= 1.5.0", [hex: :poison, repo: "hexpm", optional: true]}, {:syslog, "~> 1.1.0", [hex: :syslog, repo: "hexpm", optional: false]}], "hexpm", "ab9fab4136dbc62651ec6f16fa4842f10cf02ab4433fa3d0976c01be99398399"}, "excoveralls": {:hex, :excoveralls, "0.13.1", "b9f1697f7c9e0cfe15d1a1d737fb169c398803ffcbc57e672aa007e9fd42864c", [:mix], [{:hackney, "~> 1.16", [hex: :hackney, repo: "hexpm", optional: false]}, 
{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}], "hexpm", "b4bb550e045def1b4d531a37fb766cbbe1307f7628bf8f0414168b3f52021cce"}, - "fast_html": {:hex, :fast_html, "2.0.2", "1fabc408b2baa965cf6399a48796326f2721b21b397a3c667bb3bb88fb9559a4", [:make, :mix], [{:elixir_make, "~> 0.4", [hex: :elixir_make, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.1.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}], "hexpm", "f077e2c1597a6e2678e6cacc64f456a6c6024eb4240092c46d4212496dc59aba"}, - "fast_sanitize": {:hex, :fast_sanitize, "0.2.1", "3302421a988992b6cae08e68f77069e167ff116444183f3302e3c36017a50558", [:mix], [{:fast_html, "~> 2.0", [hex: :fast_html, repo: "hexpm", optional: false]}, {:plug, "~> 1.8", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "bcd2c54e328128515edd1a8fb032fdea7e5581672ba161fc5962d21ecee92502"}, + "fast_html": {:hex, :fast_html, "2.0.4", "4910ee49f2f6b19692e3bf30bf97f1b6b7dac489cd6b0f34cd0fe3042c56ba30", [:make, :mix], [{:elixir_make, "~> 0.4", [hex: :elixir_make, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.1.0", [hex: :nimble_pool, repo: "hexpm", optional: false]}], "hexpm", "3bb49d541dfc02ad5e425904f53376d758c09f89e521afc7d2b174b3227761ea"}, + "fast_sanitize": {:hex, :fast_sanitize, "0.2.2", "3cbbaebaea6043865dfb5b4ecb0f1af066ad410a51470e353714b10c42007b81", [:mix], [{:fast_html, "~> 2.0", [hex: :fast_html, repo: "hexpm", optional: false]}, {:plug, "~> 1.8", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "69f204db9250afa94a0d559d9110139850f57de2b081719fbafa1e9a89e94466"}, "flake_id": {:hex, :flake_id, "0.1.0", "7716b086d2e405d09b647121a166498a0d93d1a623bead243e1f74216079ccb3", [:mix], [{:base62, "~> 1.2", [hex: :base62, repo: "hexpm", optional: false]}, {:ecto, ">= 2.0.0", [hex: :ecto, repo: "hexpm", optional: true]}], "hexpm", "31fc8090fde1acd267c07c36ea7365b8604055f897d3a53dd967658c691bd827"}, "floki": {:hex, :floki, "0.27.0", "6b29a14283f1e2e8fad824bc930eaa9477c462022075df6bea8f0ad811c13599", [:mix], [{:html_entities, "~> 0.5.0", [hex: :html_entities, repo: "hexpm", optional: false]}], "hexpm", "583b8c13697c37179f1f82443bcc7ad2f76fbc0bf4c186606eebd658f7f2631b"}, "gen_smtp": {:hex, :gen_smtp, "0.15.0", "9f51960c17769b26833b50df0b96123605a8024738b62db747fece14eb2fbfcc", [:rebar3], [], "hexpm", "29bd14a88030980849c7ed2447b8db6d6c9278a28b11a44cafe41b791205440f"}, @@ -110,7 +110,7 @@ "swoosh": {:hex, :swoosh, "1.0.0", "c547cfc83f30e12d5d1fdcb623d7de2c2e29a5becfc68bf8f42ba4d23d2c2756", [:mix], [{:cowboy, "~> 1.0.1 or ~> 1.1 or ~> 2.4", [hex: :cowboy, repo: "hexpm", optional: true]}, {:gen_smtp, "~> 0.13", [hex: :gen_smtp, repo: "hexpm", optional: true]}, {:hackney, "~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: false]}, {:mail, "~> 0.2", [hex: :mail, repo: "hexpm", optional: true]}, {:mime, "~> 1.1", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_cowboy, ">= 1.0.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}], "hexpm", "b3b08e463f876cb6167f7168e9ad99a069a724e124bcee61847e0e1ed13f4a0d"}, "syslog": {:hex, :syslog, "1.1.0", "6419a232bea84f07b56dc575225007ffe34d9fdc91abe6f1b2f254fd71d8efc2", [:rebar3], [], "hexpm", "4c6a41373c7e20587be33ef841d3de6f3beba08519809329ecc4d27b15b659e1"}, "telemetry": {:hex, :telemetry, "0.4.2", "2808c992455e08d6177322f14d3bdb6b625fbcfd233a73505870d8738a2f4599", [:rebar3], [], "hexpm", "2d1419bd9dda6a206d7b5852179511722e2b18812310d304620c7bd92a13fcef"}, - "tesla": {:git, "https://github.com/teamon/tesla.git", 
"af3707078b10793f6a534938e56b963aff82fe3c", [ref: "af3707078b10793f6a534938e56b963aff82fe3c"]}, + "tesla": {:git, "https://git.pleroma.social/pleroma/elixir-libraries/tesla.git", "3a2789d8535f7b520ebbadc4494227e5ba0e5365", [ref: "3a2789d8535f7b520ebbadc4494227e5ba0e5365"]}, "timex": {:hex, :timex, "3.6.2", "845cdeb6119e2fef10751c0b247b6c59d86d78554c83f78db612e3290f819bc2", [:mix], [{:combine, "~> 0.10", [hex: :combine, repo: "hexpm", optional: false]}, {:gettext, "~> 0.10", [hex: :gettext, repo: "hexpm", optional: false]}, {:tzdata, "~> 0.1.8 or ~> 0.5 or ~> 1.0.0", [hex: :tzdata, repo: "hexpm", optional: false]}], "hexpm", "26030b46199d02a590be61c2394b37ea25a3664c02fafbeca0b24c972025d47a"}, "trailing_format_plug": {:hex, :trailing_format_plug, "0.0.7", "64b877f912cf7273bed03379936df39894149e35137ac9509117e59866e10e45", [:mix], [{:plug, "> 0.12.0", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "bd4fde4c15f3e993a999e019d64347489b91b7a9096af68b2bdadd192afa693f"}, "tzdata": {:hex, :tzdata, "1.0.3", "73470ad29dde46e350c60a66e6b360d3b99d2d18b74c4c349dbebbc27a09a3eb", [:mix], [{:hackney, "~> 1.0", [hex: :hackney, repo: "hexpm", optional: false]}], "hexpm", "a6e1ee7003c4d04ecbd21dd3ec690d4c6662db5d3bbdd7262d53cdf5e7c746c1"}, diff --git a/priv/repo/migrations/20200831142509_chat_constraints.exs b/priv/repo/migrations/20200831142509_chat_constraints.exs new file mode 100644 index 000000000..868a40a45 --- /dev/null +++ b/priv/repo/migrations/20200831142509_chat_constraints.exs @@ -0,0 +1,22 @@ +defmodule Pleroma.Repo.Migrations.ChatConstraints do + use Ecto.Migration + + def change do + remove_orphans = """ + delete from chats where not exists(select id from users where ap_id = chats.recipient); + """ + + execute(remove_orphans) + + drop(constraint(:chats, "chats_user_id_fkey")) + + alter table(:chats) do + modify(:user_id, references(:users, type: :uuid, on_delete: :delete_all)) + + modify( + :recipient, + references(:users, column: :ap_id, type: :string, on_delete: :delete_all) + ) + end + end +end diff --git a/priv/repo/migrations/20200901061256_ensure_bio_is_string.exs b/priv/repo/migrations/20200901061256_ensure_bio_is_string.exs new file mode 100644 index 000000000..0e3bb3c81 --- /dev/null +++ b/priv/repo/migrations/20200901061256_ensure_bio_is_string.exs @@ -0,0 +1,7 @@ +defmodule Pleroma.Repo.Migrations.EnsureBioIsString do + use Ecto.Migration + + def change do + execute("update users set bio = '' where bio is null", "") + end +end diff --git a/priv/repo/migrations/20200901061637_bio_set_not_null.exs b/priv/repo/migrations/20200901061637_bio_set_not_null.exs new file mode 100644 index 000000000..e3a67d4e7 --- /dev/null +++ b/priv/repo/migrations/20200901061637_bio_set_not_null.exs @@ -0,0 +1,10 @@ +defmodule Pleroma.Repo.Migrations.BioSetNotNull do + use Ecto.Migration + + def change do + execute( + "alter table users alter column bio set not null", + "alter table users alter column bio drop not null" + ) + end +end diff --git a/test/chat_test.exs b/test/chat_test.exs index 332f2180a..9e8a9ebf0 100644 --- a/test/chat_test.exs +++ b/test/chat_test.exs @@ -26,6 +26,28 @@ test "it creates a chat for a user and recipient" do assert chat.id end + test "deleting the user deletes the chat" do + user = insert(:user) + other_user = insert(:user) + + {:ok, chat} = Chat.bump_or_create(user.id, other_user.ap_id) + + Repo.delete(user) + + refute Chat.get_by_id(chat.id) + end + + test "deleting the recipient deletes the chat" do + user = insert(:user) + other_user = insert(:user) + + {:ok, 
chat} = Chat.bump_or_create(user.id, other_user.ap_id) + + Repo.delete(other_user) + + refute Chat.get_by_id(chat.id) + end + test "it returns and bumps a chat for a user and recipient if it already exists" do user = insert(:user) other_user = insert(:user) diff --git a/test/support/http_request_mock.ex b/test/support/http_request_mock.ex index eeeba7880..a0ebf65d9 100644 --- a/test/support/http_request_mock.ex +++ b/test/support/http_request_mock.ex @@ -1350,11 +1350,11 @@ def get("https://relay.mastodon.host/actor", _, _, _) do {:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/relay/relay.json")}} end - def get("http://localhost:4001/", _, "", Accept: "text/html") do + def get("http://localhost:4001/", _, "", [{"accept", "text/html"}]) do {:ok, %Tesla.Env{status: 200, body: File.read!("test/fixtures/tesla_mock/7369654.html")}} end - def get("https://osada.macgirvin.com/", _, "", Accept: "text/html") do + def get("https://osada.macgirvin.com/", _, "", [{"accept", "text/html"}]) do {:ok, %Tesla.Env{ status: 200, diff --git a/test/tasks/frontend_test.exs b/test/tasks/frontend_test.exs index 0ca2b9a28..022ae51be 100644 --- a/test/tasks/frontend_test.exs +++ b/test/tasks/frontend_test.exs @@ -48,11 +48,18 @@ test "it also works given a file" do } }) + folder = Path.join([@dir, "frontends", "pleroma", "fantasy"]) + previously_existing = Path.join([folder, "temp"]) + File.mkdir_p!(folder) + File.write!(previously_existing, "yey") + assert File.exists?(previously_existing) + capture_io(fn -> Frontend.run(["install", "pleroma", "--file", "test/fixtures/tesla_mock/frontend.zip"]) end) - assert File.exists?(Path.join([@dir, "frontends", "pleroma", "fantasy", "test.txt"])) + assert File.exists?(Path.join([folder, "test.txt"])) + refute File.exists?(previously_existing) end test "it downloads and unzips unknown frontends" do diff --git a/test/user_search_test.exs b/test/user_search_test.exs index 559ba5966..01976bf58 100644 --- a/test/user_search_test.exs +++ b/test/user_search_test.exs @@ -109,22 +109,22 @@ test "finds users, boosting ranks of friends and followers" do Enum.map(User.search("doe", resolve: false, for_user: u1), & &1.id) == [] end - test "finds followers of user by partial name" do - u1 = insert(:user) - u2 = insert(:user, %{name: "Jimi"}) - follower_jimi = insert(:user, %{name: "Jimi Hendrix"}) - follower_lizz = insert(:user, %{name: "Lizz Wright"}) - friend = insert(:user, %{name: "Jimi"}) + test "finds followings of user by partial name" do + lizz = insert(:user, %{name: "Lizz"}) + jimi = insert(:user, %{name: "Jimi"}) + following_lizz = insert(:user, %{name: "Jimi Hendrix"}) + following_jimi = insert(:user, %{name: "Lizz Wright"}) + follower_lizz = insert(:user, %{name: "Jimi"}) - {:ok, follower_jimi} = User.follow(follower_jimi, u1) - {:ok, _follower_lizz} = User.follow(follower_lizz, u2) - {:ok, u1} = User.follow(u1, friend) + {:ok, lizz} = User.follow(lizz, following_lizz) + {:ok, _jimi} = User.follow(jimi, following_jimi) + {:ok, _follower_lizz} = User.follow(follower_lizz, lizz) - assert Enum.map(User.search("jimi", following: true, for_user: u1), & &1.id) == [ - follower_jimi.id + assert Enum.map(User.search("jimi", following: true, for_user: lizz), & &1.id) == [ + following_lizz.id ] - assert User.search("lizz", following: true, for_user: u1) == [] + assert User.search("lizz", following: true, for_user: lizz) == [] end test "find local and remote users for authenticated users" do diff --git a/test/user_test.exs b/test/user_test.exs index 3cf248659..50f72549e 
100644 --- a/test/user_test.exs +++ b/test/user_test.exs @@ -1466,7 +1466,7 @@ test "delete/1 purges a user when they wouldn't be fully deleted" do user = User.get_by_id(user.id) assert %User{ - bio: nil, + bio: "", raw_bio: nil, email: nil, name: nil, diff --git a/test/web/activity_pub/object_validators/chat_validation_test.exs b/test/web/activity_pub/object_validators/chat_validation_test.exs index 50bf03515..16e4808e5 100644 --- a/test/web/activity_pub/object_validators/chat_validation_test.exs +++ b/test/web/activity_pub/object_validators/chat_validation_test.exs @@ -69,6 +69,7 @@ test "validates for a basic object we build", %{valid_chat_message: valid_chat_m assert {:ok, object, _meta} = ObjectValidator.validate(valid_chat_message, []) assert Map.put(valid_chat_message, "attachment", nil) == object + assert match?(%{"firefox" => _}, object["emoji"]) end test "validates for a basic object with an attachment", %{ diff --git a/test/web/activity_pub/transmogrifier/question_handling_test.exs b/test/web/activity_pub/transmogrifier/question_handling_test.exs index c82361828..74ee79543 100644 --- a/test/web/activity_pub/transmogrifier/question_handling_test.exs +++ b/test/web/activity_pub/transmogrifier/question_handling_test.exs @@ -106,6 +106,57 @@ test "Mastodon Question activity with HTML tags in plaintext" do assert Enum.sort(object.data["oneOf"]) == Enum.sort(options) end + test "Mastodon Question activity with custom emojis" do + options = [ + %{ + "type" => "Note", + "name" => ":blobcat:", + "replies" => %{"totalItems" => 0, "type" => "Collection"} + }, + %{ + "type" => "Note", + "name" => ":blobfox:", + "replies" => %{"totalItems" => 0, "type" => "Collection"} + } + ] + + tag = [ + %{ + "icon" => %{ + "type" => "Image", + "url" => "https://blob.cat/emoji/custom/blobcats/blobcat.png" + }, + "id" => "https://blob.cat/emoji/custom/blobcats/blobcat.png", + "name" => ":blobcat:", + "type" => "Emoji", + "updated" => "1970-01-01T00:00:00Z" + }, + %{ + "icon" => %{"type" => "Image", "url" => "https://blob.cat/emoji/blobfox/blobfox.png"}, + "id" => "https://blob.cat/emoji/blobfox/blobfox.png", + "name" => ":blobfox:", + "type" => "Emoji", + "updated" => "1970-01-01T00:00:00Z" + } + ] + + data = + File.read!("test/fixtures/mastodon-question-activity.json") + |> Poison.decode!() + |> Kernel.put_in(["object", "oneOf"], options) + |> Kernel.put_in(["object", "tag"], tag) + + {:ok, %Activity{local: false} = activity} = Transmogrifier.handle_incoming(data) + object = Object.normalize(activity, false) + + assert object.data["oneOf"] == options + + assert object.data["emoji"] == %{ + "blobcat" => "https://blob.cat/emoji/custom/blobcats/blobcat.png", + "blobfox" => "https://blob.cat/emoji/blobfox/blobfox.png" + } + end + test "returns an error if received a second time" do data = File.read!("test/fixtures/mastodon-question-activity.json") |> Poison.decode!() diff --git a/test/web/admin_api/controllers/admin_api_controller_test.exs b/test/web/admin_api/controllers/admin_api_controller_test.exs index dbf478edf..3bc88c6a9 100644 --- a/test/web/admin_api/controllers/admin_api_controller_test.exs +++ b/test/web/admin_api/controllers/admin_api_controller_test.exs @@ -203,7 +203,7 @@ test "single user", %{admin: admin, conn: conn} do assert user.note_count == 0 assert user.follower_count == 0 assert user.following_count == 0 - assert user.bio == nil + assert user.bio == "" assert user.name == nil assert called(Pleroma.Web.Federator.publish(:_)) diff --git a/test/web/common_api/common_api_test.exs 
b/test/web/common_api/common_api_test.exs index 4ba6232dc..800db9a20 100644 --- a/test/web/common_api/common_api_test.exs +++ b/test/web/common_api/common_api_test.exs @@ -9,6 +9,7 @@ defmodule Pleroma.Web.CommonAPITest do alias Pleroma.Conversation.Participation alias Pleroma.Notification alias Pleroma.Object + alias Pleroma.Repo alias Pleroma.User alias Pleroma.Web.ActivityPub.ActivityPub alias Pleroma.Web.ActivityPub.Transmogrifier @@ -18,6 +19,7 @@ defmodule Pleroma.Web.CommonAPITest do import Pleroma.Factory import Mock + import Ecto.Query, only: [from: 2] require Pleroma.Constants @@ -808,6 +810,69 @@ test "should unpin when deleting a status", %{user: user, activity: activity} do [user: user, activity: activity] end + test "marks notifications as read after mute" do + author = insert(:user) + activity = insert(:note_activity, user: author) + + friend1 = insert(:user) + friend2 = insert(:user) + + {:ok, reply_activity} = + CommonAPI.post( + friend2, + %{ + status: "@#{author.nickname} @#{friend1.nickname} test reply", + in_reply_to_status_id: activity.id + } + ) + + {:ok, favorite_activity} = CommonAPI.favorite(friend2, activity.id) + {:ok, repeat_activity} = CommonAPI.repeat(activity.id, friend1) + + assert Repo.aggregate( + from(n in Notification, where: n.seen == false and n.user_id == ^friend1.id), + :count + ) == 1 + + unread_notifications = + Repo.all(from(n in Notification, where: n.seen == false, where: n.user_id == ^author.id)) + + assert Enum.any?(unread_notifications, fn n -> + n.type == "favourite" && n.activity_id == favorite_activity.id + end) + + assert Enum.any?(unread_notifications, fn n -> + n.type == "reblog" && n.activity_id == repeat_activity.id + end) + + assert Enum.any?(unread_notifications, fn n -> + n.type == "mention" && n.activity_id == reply_activity.id + end) + + {:ok, _} = CommonAPI.add_mute(author, activity) + assert CommonAPI.thread_muted?(author, activity) + + assert Repo.aggregate( + from(n in Notification, where: n.seen == false and n.user_id == ^friend1.id), + :count + ) == 1 + + read_notifications = + Repo.all(from(n in Notification, where: n.seen == true, where: n.user_id == ^author.id)) + + assert Enum.any?(read_notifications, fn n -> + n.type == "favourite" && n.activity_id == favorite_activity.id + end) + + assert Enum.any?(read_notifications, fn n -> + n.type == "reblog" && n.activity_id == repeat_activity.id + end) + + assert Enum.any?(read_notifications, fn n -> + n.type == "mention" && n.activity_id == reply_activity.id + end) + end + test "add mute", %{user: user, activity: activity} do {:ok, _} = CommonAPI.add_mute(user, activity) assert CommonAPI.thread_muted?(user, activity) diff --git a/test/web/mastodon_api/controllers/auth_controller_test.exs b/test/web/mastodon_api/controllers/auth_controller_test.exs index a485f8e41..4fa95fce1 100644 --- a/test/web/mastodon_api/controllers/auth_controller_test.exs +++ b/test/web/mastodon_api/controllers/auth_controller_test.exs @@ -122,17 +122,27 @@ test "it doesn't fail when a user has no email", %{conn: conn} do {:ok, user: user} end - test "it returns 404 when user is not found", %{conn: conn, user: user} do + test "it returns 204 when user is not found", %{conn: conn, user: user} do conn = post(conn, "/auth/password?email=nonexisting_#{user.email}") - assert conn.status == 404 - assert conn.resp_body == "" + + assert conn + |> json_response(:no_content) end - test "it returns 400 when user is not local", %{conn: conn, user: user} do + test "it returns 204 when user is not local", %{conn: 
conn, user: user} do {:ok, user} = Repo.update(Ecto.Changeset.change(user, local: false)) conn = post(conn, "/auth/password?email=#{user.email}") - assert conn.status == 400 - assert conn.resp_body == "" + + assert conn + |> json_response(:no_content) + end + + test "it returns 204 when user is deactivated", %{conn: conn, user: user} do + {:ok, user} = Repo.update(Ecto.Changeset.change(user, deactivated: true, local: true)) + conn = post(conn, "/auth/password?email=#{user.email}") + + assert conn + |> json_response(:no_content) end end diff --git a/test/web/mastodon_api/controllers/list_controller_test.exs b/test/web/mastodon_api/controllers/list_controller_test.exs index 57a9ef4a4..091ec006c 100644 --- a/test/web/mastodon_api/controllers/list_controller_test.exs +++ b/test/web/mastodon_api/controllers/list_controller_test.exs @@ -67,7 +67,7 @@ test "adding users to a list" do assert following == [other_user.follower_address] end - test "removing users from a list" do + test "removing users from a list, body params" do %{user: user, conn: conn} = oauth_access(["write:lists"]) other_user = insert(:user) third_user = insert(:user) @@ -85,6 +85,24 @@ test "removing users from a list" do assert following == [third_user.follower_address] end + test "removing users from a list, query params" do + %{user: user, conn: conn} = oauth_access(["write:lists"]) + other_user = insert(:user) + third_user = insert(:user) + {:ok, list} = Pleroma.List.create("name", user) + {:ok, list} = Pleroma.List.follow(list, other_user) + {:ok, list} = Pleroma.List.follow(list, third_user) + + assert %{} == + conn + |> put_req_header("content-type", "application/json") + |> delete("/api/v1/lists/#{list.id}/accounts?account_ids[]=#{other_user.id}") + |> json_response_and_validate_schema(:ok) + + %Pleroma.List{following: following} = Pleroma.List.get(list.id, user) + assert following == [third_user.follower_address] + end + test "listing users in a list" do %{user: user, conn: conn} = oauth_access(["read:lists"]) other_user = insert(:user) diff --git a/test/web/rich_media/aws_signed_url_test.exs b/test/web/rich_media/aws_signed_url_test.exs index b30f4400e..1ceae1a31 100644 --- a/test/web/rich_media/aws_signed_url_test.exs +++ b/test/web/rich_media/aws_signed_url_test.exs @@ -21,7 +21,7 @@ test "s3 signed url is parsed correct for expiration time" do expire_time = Timex.parse!(timestamp, "{ISO:Basic:Z}") |> Timex.to_unix() |> Kernel.+(valid_till) - assert expire_time == Pleroma.Web.RichMedia.Parser.TTL.AwsSignedUrl.ttl(metadata, url) + assert {:ok, expire_time} == Pleroma.Web.RichMedia.Parser.TTL.AwsSignedUrl.ttl(metadata, url) end test "s3 signed url is parsed and correct ttl is set for rich media" do @@ -55,7 +55,7 @@ test "s3 signed url is parsed and correct ttl is set for rich media" do Cachex.put(:rich_media_cache, url, metadata) - Pleroma.Web.RichMedia.Parser.set_ttl_based_on_image({:ok, metadata}, url) + Pleroma.Web.RichMedia.Parser.set_ttl_based_on_image(metadata, url) {:ok, cache_ttl} = Cachex.ttl(:rich_media_cache, url) diff --git a/test/web/rich_media/parser_test.exs b/test/web/rich_media/parser_test.exs index 420a612c6..21ae35f8b 100644 --- a/test/web/rich_media/parser_test.exs +++ b/test/web/rich_media/parser_test.exs @@ -5,6 +5,8 @@ defmodule Pleroma.Web.RichMedia.ParserTest do use ExUnit.Case, async: true + alias Pleroma.Web.RichMedia.Parser + setup do Tesla.Mock.mock(fn %{ @@ -48,23 +50,27 @@ defmodule Pleroma.Web.RichMedia.ParserTest do %{method: :get, url: "http://example.com/empty"} -> 
%Tesla.Env{status: 200, body: "hello"} + + %{method: :get, url: "http://example.com/malformed"} -> + %Tesla.Env{status: 200, body: File.read!("test/fixtures/rich_media/malformed-data.html")} + + %{method: :get, url: "http://example.com/error"} -> + {:error, :overload} end) :ok end test "returns error when no metadata present" do - assert {:error, _} = Pleroma.Web.RichMedia.Parser.parse("http://example.com/empty") + assert {:error, _} = Parser.parse("http://example.com/empty") end test "doesn't just add a title" do - assert Pleroma.Web.RichMedia.Parser.parse("http://example.com/non-ogp") == - {:error, - "Found metadata was invalid or incomplete: %{\"url\" => \"http://example.com/non-ogp\"}"} + assert {:error, {:invalid_metadata, _}} = Parser.parse("http://example.com/non-ogp") end test "parses ogp" do - assert Pleroma.Web.RichMedia.Parser.parse("http://example.com/ogp") == + assert Parser.parse("http://example.com/ogp") == {:ok, %{ "image" => "http://ia.media-imdb.com/images/rock.jpg", @@ -77,7 +83,7 @@ test "parses ogp" do end test "falls back to when ogp:title is missing" do - assert Pleroma.Web.RichMedia.Parser.parse("http://example.com/ogp-missing-title") == + assert Parser.parse("http://example.com/ogp-missing-title") == {:ok, %{ "image" => "http://ia.media-imdb.com/images/rock.jpg", @@ -90,7 +96,7 @@ test "falls back to <title> when ogp:title is missing" do end test "parses twitter card" do - assert Pleroma.Web.RichMedia.Parser.parse("http://example.com/twitter-card") == + assert Parser.parse("http://example.com/twitter-card") == {:ok, %{ "card" => "summary", @@ -103,7 +109,7 @@ test "parses twitter card" do end test "parses OEmbed" do - assert Pleroma.Web.RichMedia.Parser.parse("http://example.com/oembed") == + assert Parser.parse("http://example.com/oembed") == {:ok, %{ "author_name" => "‮‭‬bees‬", @@ -132,6 +138,10 @@ test "parses OEmbed" do end test "rejects invalid OGP data" do - assert {:error, _} = Pleroma.Web.RichMedia.Parser.parse("http://example.com/malformed") + assert {:error, _} = Parser.parse("http://example.com/malformed") + end + + test "returns error if getting page was not successful" do + assert {:error, :overload} = Parser.parse("http://example.com/error") end end diff --git a/test/web/twitter_api/util_controller_test.exs b/test/web/twitter_api/util_controller_test.exs index 354d77b56..d164127ee 100644 --- a/test/web/twitter_api/util_controller_test.exs +++ b/test/web/twitter_api/util_controller_test.exs @@ -594,7 +594,7 @@ test "with proper permissions and valid password", %{conn: conn, user: user} do user = User.get_by_id(user.id) assert user.deactivated == true assert user.name == nil - assert user.bio == nil + assert user.bio == "" assert user.password_hash == nil end end