post scheduling (#554)
* prepare codebase to create scheduled tasks
there is some prep work involved before the scheduler can land. simply
put, we extract the `created_utc` interface out of *everything* that
uses it so we don't have to repeat ourselves a bunch. all fun stuff.
next commit is the meat of it.
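A minimal sketch of the shared `created_utc` mixin this prep work points at; the class name and the `created_date` helper are assumptions, not the actual interface from the codebase.

```python
# Hypothetical sketch of a shared created_utc mixin; names are illustrative.
import time

from sqlalchemy import Column, Integer


class CreatedDateTime:
    """Mixin for models that record their creation time as a unix timestamp."""
    created_utc = Column(Integer, nullable=False)

    def __init__(self, *args, **kwargs):
        kwargs.setdefault("created_utc", int(time.time()))
        super().__init__(*args, **kwargs)

    @property
    def created_date(self) -> str:
        return time.strftime("%d %b %Y", time.gmtime(self.created_utc))
```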
* cron: basic backend work for scheduler
* avoid import loop
* attempt 2 at fixing import loops
* parenthesize because of operator precedence
* delete file that came back for some reason.
* does NOPing the oauth apps work?
* import late and undo clients.py change
* stringify column names.
* reorder imports.
* remove task reference
* fix missing mapper object
* make coupled to repeatabletask i guess
* sanitize: fix sanitize imports
* import shadowing crap
* re-shadow shadowed variable
* fix regexes
* use the correct not operator
* readd missing commit
* scheduler: SQLA only allows concrete relations
* implement submission scheduler
* fix import loop with db_session
* get rid of import loop in submission.py and comment.py
* remove import loops by deferring import until function call
* i give up.
* awful.
* ...
* fix another app import loop
* fix missing import in route handler
* fix import error in wrappers.py
* fix wrapper error
* call update wrapper in the admin_level_required case
* :marseyshrug:
* fix issue with wrapper
* some cleanup and some fixes
* some more cleanup
let's avoid polluting scopes where we can.
* ...
* add SCHEDULED_POSTS permission.
* move const.py into config like the other files.
* style fixes.
* lock table for concurrency improvements
* don't attempt to commit on errors
* Refactor code, create `TaskRunContext`, create python callable task type.
* use import contextlib
* testing stuff i guess.
* handle repeatable tasks properly.
* Attempt another fix at fighting the mapper
* do it right ig
* SQLA 1.4 doesn't support nested polymorphism ig
* fix erroneous class import
* fix mapper errors
* import app in wrappers.py
* fix import failures and stuff like that.
* embed and import fixes
* minor formatting changes.
* Add running state enum and don't attempt to check for currently running tasks.
* isort
* documentation, style, and commit after each task.
* Add completion time and more docs, rename, etc
* document `CRON_SLEEP_SECONDS` better.
* add note about making LiteralString
* filter out tasks that have been run in the future
* reference RepeatableTask's `__tablename__` directly
* use a master/slave configuration for tasks
the master periodically checks that the slave is alive, healthy, and not
consuming too many resources, and if necessary kills its child and
restarts it.
only one master/slave relationship is supported at the moment.
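A rough sketch of the watchdog pattern described above (later in this log the DIY monitor is replaced by supervisord); the module path, memory threshold, and check interval are assumptions.

```python
# Hypothetical master/slave watchdog sketch; thresholds and names are illustrative.
import subprocess
import sys
import time

import psutil

MEMORY_LIMIT_BYTES = 512 * 1024 * 1024
CHECK_INTERVAL_SECONDS = 30


def run_worker() -> subprocess.Popen:
    # Spawn the cron worker as a child, passing through stdout/stderr.
    return subprocess.Popen([sys.executable, "-m", "files.cron"],
                            stdout=sys.stdout, stderr=sys.stderr)


def master_loop() -> None:
    child = run_worker()
    while True:
        time.sleep(CHECK_INTERVAL_SECONDS)
        alive = child.poll() is None
        too_big = alive and psutil.Process(child.pid).memory_info().rss > MEMORY_LIMIT_BYTES
        if alive and not too_big:
            continue
        if alive:
            child.kill()
            child.wait()
        child = run_worker()
```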
* don't duplicate process unnecessarily
* note impl detail, add comments
* fix imports.
* getting imports to stop being stupid.
* environment notes.
* syntax derp
* *sigh*
* stupid environment stuff
* add UI for submitting a scheduled post
* stupid things i need to fix the user class
* ...
* fix template
* add formkey
* pass v
* add hour and minute field
* bleh
* remove concrete
* the sqlalchemy docs are wrong
* fix me being dumb and not understanding error messages
* missing author attribute for display
* author_name property
* it's a property
* with_polymorphic i think fixes this
* dsfavgnhmjk
* *sigh*
* okay try this again
* try getting rid of the comment section
* include -> extends
* put the div outside of the thing.
* fix user page listings :/
* mhm
* i hate this why isn't this working
* this should fix it
* Fix posts being set as disabled by default
* form UI improvements
* label
* <textarea>s should have their closing tag
* UI fixes.
* and fix erroneous spinner thing.
* don't abort(415) when browsers send 0-length files for some reason
* UI improvements
* line break.
* CSS :S
* better explainer
* don't show moderation buttons for scheduled posts
* ...
* meh
* add edit form
* include forms on default page.
* fix hour/minute selection.
* improve ui i guess and add api
* Show previous postings on scheduled task page
* create task id
* sqla
* posts -> submissions
* fix OTM relationship
* edit URL
* use common formkey control
* Idk why this isn't working
* Revert "Idk why this isn't working"
This reverts commit 3b93f741df.
* does removing viewonly fix it?
* don't import routes on db migrations
* apparently this has to be a string
* UI improvements redux
* margins and stuff
* add cron to supervisord
* remove stupid duplication
* typo fix
* postgres syntax error
* better lock and error handling
* add relationship between task and runs
* fix some ui stuff
* fix incorrect timestamp comparison
* ...
* Fix logic errors blocking scheduled posts
Two bugs here:
- RepeatableTask.run_time_last <= now: run_time_last is NULL by
default. NULL is not greater than, less than, or equal to any
value. We use NULL to signify a never-run task; check for that
condition when building the task list.
- `6 <= weekday <= 0`: there is no integer that is both gte 6 and
lte 0. This was always false.
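A hedged sketch of the corrected selection logic implied by the two fixes above; the attribute names follow the commit text, the rest (helper names, enabled flag) is illustrative.

```python
# Illustrative sketch of the two fixes described above; not the exact code.
import time

from sqlalchemy import or_


def runnable_tasks(db, RepeatableTask):
    now = int(time.time())
    # NULL run_time_last means "never run"; compare against it explicitly
    # instead of relying on `run_time_last <= now`, which is never true for NULL.
    return db.query(RepeatableTask).filter(
        RepeatableTask.enabled == True,
        or_(RepeatableTask.run_time_last == None,
            RepeatableTask.run_time_last <= now),
    ).all()


def weekday_in_range(weekday: int) -> bool:
    # The old check `6 <= weekday <= 0` can never hold; the intent was a
    # bounds check on 0..6.
    return 0 <= weekday <= 6
```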
* pass through worker process STDOUT and STDERR
* Add scheduler to admin panel
* scheduler
* fix listing and admin home
* date formatting fixes
* fix ages
* task user interface
* fix some more import crap i have to deal with
* fix typing
* avoid import loop
* UI fixes
* fix incorrect type
* task type
* Scheduled task UI improvements (add runs and stuff)
* make the width a lil bit smaller
* task runs.
* fix submit page
* add alembic migration
* log on startup
* Fix showing edit button
* Fix logic for `can_edit` (accidentally did `author_id` instead of `id`)
* Broad review pass
Review:
- Call `invalidate_cache` with `is_html=` explicitly for clarity,
rather than a bare boolean in the call args.
- Remove `marseys_const*` and associated stateful const system:
the implementation was good if we needed them, but TheMotte
doesn't use emoji, and a greenfield emoji system would likely
not keep those darned lists floating in thread-local scope.
Also they were only needed for goldens and random emoji, which
are fairly non-central features.
- Get `os.environ` fully out of the templates by using the new
constants we already have in files.helpers.config.environment.
- Given the files.routes.posts cleanup, get rid of the shop discount
dict. It's already a mapping of badge IDs to discounts for badges that
likely won't continue to exist (if they even do at present).
- RepeatableTaskRun.exception: use `@property.setter` instead of
overriding `__setattr__` (see the sketch after this list).
Fix:
- Welcome message literal contained an indented Markdown code block.
- Condition to show "View source" button changed to show source to
logged out. This may well be a desirable change, but it's not
clearly intended here.
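A minimal sketch of the `@property.setter` approach mentioned in the review item above; the column and attribute names besides `exception` are assumptions.

```python
# Hypothetical sketch: store a task run's exception via a property setter
# instead of intercepting __setattr__; column names are illustrative.
from typing import Optional

from sqlalchemy import Column, Integer, Text
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class RepeatableTaskRun(Base):
    __tablename__ = "repeatable_task_runs"
    id = Column(Integer, primary_key=True)
    traceback_str = Column(Text, nullable=True)

    _exception: Optional[BaseException] = None

    @property
    def exception(self) -> Optional[BaseException]:
        return self._exception

    @exception.setter
    def exception(self, value: Optional[BaseException]) -> None:
        # Keep the live exception object around and persist a readable form.
        self._exception = value
        self.traceback_str = repr(value) if value else None
```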
* Fix couple of routing issues
* fix 400 with post body editing
* Add error handler for HTTP 415
* fix router giving wrong arg name to handler
* Use supervisord to monitor memory rather than DIY
Also means we're using pip for getting supervisord now, so we don't rely
on the Debian image base for any packages.
* fix task run elapsed time display
* formatting and removing redundant code
* Fix missing ModAction import
* dates and times fixes
* Having to modify imports here anyway, might as
well change it.
* correct documentation.
* don't use urlunparse
* validators: import sanitize instead of from syntax
* cron: prevent races on task running
RepeatableTask.run_state_enum acts as the mutex on repeatable tasks.
Previously, the list of tasks to run was acquired before individually
locking each task, so between those two points there was a window where
the table was unlocked and the tasks were still in state WAITING. This
could potentially have led to two 'cron' processes each running the
same task simultaneously. Instead, we check for runnability both when
building the preliminary list and when mutexing the task via its run
state in the database.
Also:
- g.db and the cron db object are both instances of `Session`, not
`scoped_session` because they are obtained from
`scoped_session.__call__`, which acts as a `Session` factory.
Propagate this to the type hints.
- Sort order of task run submissions so /tasks/scheduled_posts/<id>
"Previous Task Runs" listings are useful.
* Notify followers on post publication
This was old behavior lost in the refactoring of the submit endpoint.
Also fix an AttributeError in `Follow.__repr__` which carried over
from all the repr copypasta.
* Fix image attachment
Any check for `file.content_length` relies on browsers sending
Content-Length headers with the request. It seems that few actually do.
The pre-refactor approach was to check for truthiness, which excludes
both None and the strange empty strings that we seem to get in absence
of a file upload. We return to doing so.
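A small sketch of the truthiness check described above; the route shape and field name are assumed, though the falsiness of an empty Werkzeug FileStorage is real behavior.

```python
# Illustrative: trust truthiness of the uploaded FileStorage rather than
# file.content_length, which most browsers don't populate.
from typing import Optional

from flask import request
from werkzeug.datastructures import FileStorage


def attached_file() -> Optional[FileStorage]:
    file = request.files.get("file")
    # FileStorage is falsy when no filename was submitted, so this covers both
    # a missing field and the empty placeholder some browsers send.
    if not file:
        return None
    return file
```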
---------
Co-authored-by: TLSM <duolsm@outlook.com>
This commit is contained in: parent 9133d35e6f, commit be952c2771.
121 changed files with 3284 additions and 1808 deletions.
Diff overview (the rendered diff below the commit message is truncated
partway through files.routes.posts):

- files/routes/admin/tasks.py (new file, 179 lines): admin UI routes for the
  scheduler, gated behind PERMS['SCHEDULER'] and PERMS['SCHEDULER_POSTS'].
  GET /tasks/ and /tasks/<task_id>/ list RepeatableTasks and show a single
  task; GET /tasks/<task_id>/runs/<run_id> shows a RepeatableTaskRun;
  POST /tasks/<task_id>/schedule and /tasks/scheduled_posts/<task_id>/schedule
  rebuild a task's schedule (enabled flag, DayOfWeek flags, hour/minute);
  GET/POST /tasks/scheduled_posts/ list and create ScheduledSubmissionTasks
  via validators.ValidatedSubmissionLike; GET /tasks/scheduled_posts/<pid>
  shows one; POST /tasks/scheduled_posts/<pid>/content edits its title/body.
- files/routes/allroutes.py (new file, 51 lines): the app-level request hooks
  in one module — before_request (site settings load, host check, bot gate,
  User-Agent requirement, rate-limit check, g.db setup), teardown_appcontext
  (close g.db, flush stdout), and after_request (Strict-Transport-Security
  and X-Frame-Options headers).
- files/routes/importstar.py (new file, 17 lines): re-exports the common
  Flask names (Response, abort, g, jsonify, make_response, redirect,
  render_template, request, send_file, send_from_directory, session) so
  route handlers can `from files.routes.importstar import *`; its docstring
  notes Flask imports should stay out of the models.
- Import cleanups across the route modules: `files.helpers.const` becomes
  `files.helpers.config.const`, environment-derived values (SITE_FULL,
  DEFAULT_TIME_FILTER, HCAPTCHA_*, CF_*, MULTIMEDIA_EMBEDDING_ENABLED, ...)
  come from `files.helpers.config.environment`, and imports are sorted.
- Admin actions (shadowban/unshadowban, ban_post/unban_post,
  sticky_post/unsticky_post) call `invalidate_cache(frontlist=True)` instead
  of `cache.delete_memoized(frontlist)`.
- `frontlist` and `changeloglist` move out of the front module into
  `files.helpers.listing`; the front and feeds routes call
  `listing.frontlist`/`listing.changeloglist`.
- The shop routes drop award discount handling (price = baseprice).
- Comment and post spam thresholds use `v.age_seconds` instead of `v.age`.
- The error handlers gain `@app.errorhandler(415)`.
- Signup formkeys are built from SECRET_KEY, and captcha validation uses the
  HCAPTCHA_SECRET/HCAPTCHA_SITEKEY constants.
- files.routes.posts: `publish` and `edit_post` are rebuilt around
  `post.publish()` and validators.ValidatedSubmissionLike; the repost URL
  canonicalization, domain-ban check, duplicate checks, and antispam check
  move into helpers (_duplicate_check, _duplicate_check2,
  _execute_domain_ban_check, _do_antispam_submission_check), and
  `submit_post` is rewritten to use them.
|
||||
image = process_image(name)
|
||||
if app.config['MULTIMEDIA_EMBEDDING_ENABLED']:
|
||||
body += f"\n\n"
|
||||
else:
|
||||
body += f'\n\n<a href="{image}">{image}</a>'
|
||||
else:
|
||||
return error("Image files only")
|
||||
|
||||
body_html = sanitize(body)
|
||||
_do_antispam_submission_check(v, validated_post)
|
||||
|
||||
club = bool(request.values.get("club",""))
|
||||
|
||||
if embed and len(embed) > 1500: embed = None
|
||||
|
||||
is_bot = bool(request.headers.get("Authorization"))
|
||||
|
||||
# Invariant: these values are guarded and obey the length bound
|
||||
assert len(title) <= MAX_TITLE_LENGTH
|
||||
assert len(body) <= MAX_BODY_LENGTH
|
||||
assert len(validated_post.title) <= MAX_TITLE_LENGTH
|
||||
assert len(validated_post.body) <= SUBMISSION_BODY_LENGTH_MAXIMUM
|
||||
|
||||
post = Submission(
|
||||
private=bool(request.values.get("private","")),
|
||||
|
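Most of the inline logic deleted in this hunk has moved into helpers (`_duplicate_check`, `_duplicate_check2`, `_execute_domain_ban_check`, `_do_antispam_submission_check`) that work on a pre-validated `validated_post` object. The helper bodies are not part of this diff; the sketch below only guesses at the exact-duplicate check, reconstructed from the inline query being removed here, and the real signature and fields may differ.

from typing import Optional

import werkzeug.wrappers
from flask import g, redirect

from files.classes.submission import Submission

# Hypothetical reconstruction of _duplicate_check2, not taken from this diff.
# SITE is assumed to be the existing site-name constant from the codebase.
def _duplicate_check2(user_id: int, validated_post) -> Optional[werkzeug.wrappers.Response]:
    dup = g.db.query(Submission).filter(
        Submission.author_id == user_id,
        Submission.deleted_utc == 0,
        Submission.title == validated_post.title,
        Submission.url == validated_post.url,
        Submission.body == validated_post.body,
    ).one_or_none()
    if dup and SITE != 'localhost':
        return redirect(dup.permalink)
    return None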
@@ -750,75 +598,28 @@ def submit_post(v, sub=None):
author_id=v.id,
over_18=bool(request.values.get("over_18","")),
app_id=v.client.application.id if v.client else None,
is_bot = is_bot,
url=url,
body=body,
body_html=body_html,
embed_url=embed,
title=title,
title_html=title_html,
is_bot=is_bot,
url=validated_post.url,
body=validated_post.body,
body_html=validated_post.body_html,
embed_url=validated_post.embed_slow,
title=validated_post.title,
title_html=validated_post.title_html,
sub=sub,
ghost=False,
filter_state='filtered' if v.admin_level == 0 and app.config['SETTINGS']['FilterNewPosts'] else 'normal'
filter_state='filtered' if v.admin_level == 0 and app.config['SETTINGS']['FilterNewPosts'] else 'normal',
thumburl=validated_post.thumburl
)

g.db.add(post)
g.db.flush()

vote = Vote(user_id=v.id,
vote_type=1,
submission_id=post.id
)
g.db.add(vote)

if request.files.get('file') and request.headers.get("cf-ipcountry") != "T1":

file = request.files['file']

if file.content_type.startswith('image/'):
name = f'/images/{time.time()}'.replace('.','') + '.webp'
file.save(name)
post.url = process_image(name)

name2 = name.replace('.webp', 'r.webp')
copyfile(name, name2)
post.thumburl = process_image(name2, resize=100)
else:
return error("Image files only")
post.submit(g.db)

if not post.thumburl and post.url:
gevent.spawn(thumbnail_thread, post.id)

if not post.private and not post.ghost:

notify_users = NOTIFY_USERS(f'{title} {body}', v)

if notify_users:
cid = notif_comment2(post)
for x in notify_users:
add_notif(cid, x)

if (request.values.get('followers') or is_bot) and v.followers:
text = f"@{v.username} has made a new post: [{post.title}]({post.shortlink})"
if post.sub: text += f" in <a href='/h/{post.sub}'>/h/{post.sub}"

cid = notif_comment(text, autojanny=True)
for follow in v.followers:
user = get_account(follow.user_id)
if post.club and not user.paid_dues: continue
add_notif(cid, user.id)

v.post_count = g.db.query(Submission.id).filter_by(author_id=v.id, is_banned=False, deleted_utc=0).count()
g.db.add(v)
post.publish()
g.db.commit()

cache.delete_memoized(frontlist)
cache.delete_memoized(User.userpagelisting)

if v.admin_level > 0 and ("[changelog]" in post.title.lower() or "(changelog)" in post.title.lower()) and not post.private:
cache.delete_memoized(changeloglist)

if request.headers.get("Authorization"): return post.json
if request.headers.get("Authorization"):
return post.json
else:
post.voted = 1
if 'megathread' in post.title.lower(): sort = 'new'
@@ -830,7 +631,6 @@ def submit_post(v, sub=None):
@limiter.limit("1/second;30/minute;200/hour;1000/day")
@auth_required
def delete_post_pid(pid, v):

post = get_post(pid)
if post.author_id != v.id:
abort(403)

@@ -841,7 +641,7 @@ def delete_post_pid(pid, v):

g.db.add(post)

cache.delete_memoized(frontlist)
invalidate_cache(frontlist=True)

g.db.commit()
@@ -853,10 +653,11 @@ def delete_post_pid(pid, v):
def undelete_post_pid(pid, v):
post = get_post(pid)
if post.author_id != v.id: abort(403)
post.deleted_utc =0
post.deleted_utc = 0

g.db.add(post)

cache.delete_memoized(frontlist)
invalidate_cache(frontlist=True)

g.db.commit()
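Several hunks in this file (and in the settings routes below) swap direct `cache.delete_memoized(frontlist)` calls for `invalidate_cache(frontlist=True)`. The helper itself is not shown in this diff; a minimal sketch of what `files/helpers/caching.py` might look like, assuming it simply wraps the old memoization call behind a keyword flag:

# Sketch only; the real invalidate_cache may take more flags or behave differently.
from files.__main__ import cache

def invalidate_cache(*, frontlist: bool = False):
    # Deferring the import keeps this helper out of the route-module
    # import loops mentioned in the commit log.
    if frontlist:
        from files.routes.front import frontlist as frontlist_listing
        cache.delete_memoized(frontlist_listing)

Call sites then only need `invalidate_cache(frontlist=True)`, which is exactly what the replacements in these hunks do.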
@@ -1,9 +1,10 @@
from files.helpers.wrappers import *
from files.helpers.get import *
from flask import g

from files.__main__ import app, limiter
from os import path
from files.helpers.get import *
from files.helpers.sanitize import filter_emojis_only
from files.helpers.wrappers import *


@app.post("/report/post/<pid>")
@limiter.limit("1/second;30/minute;200/hour;1000/day")
@@ -1,11 +1,10 @@
from files.helpers.wrappers import *
import re
from sqlalchemy import *
from flask import *

from files.__main__ import app
from files.helpers.contentsorting import apply_time_filter, sort_objects
from files.helpers.strings import sql_ilike_clean

from files.helpers.wrappers import *
from files.routes.importstar import *

valid_params=[
'author',
@@ -1,15 +1,14 @@
import os
from shutil import copyfile

from files.__main__ import app, limiter
from files.helpers.alerts import *
from files.helpers.caching import invalidate_cache
from files.helpers.config.const import *
from files.helpers.media import process_image
from files.helpers.sanitize import *
from files.helpers.const import *
from files.mail import *
from files.__main__ import app, cache, limiter
from .front import frontlist
import os
from files.helpers.sanitize import filter_emojis_only
from files.helpers.strings import sql_ilike_clean
from shutil import copyfile
import requests
from files.mail import *

tiers={
"(Paypig)": 1,

@@ -186,7 +185,7 @@ def settings_profile_post(v):
if frontsize in {"15", "25", "50", "100"}:
v.frontsize = int(frontsize)
updated = True
cache.delete_memoized(frontlist)
invalidate_cache(frontlist=True)
else: abort(400)

defaultsortingcomments = request.values.get("defaultsortingcomments")

@@ -263,7 +262,7 @@ def changelogsub(v):
v.changelogsub = not v.changelogsub
g.db.add(v)

cache.delete_memoized(frontlist)
invalidate_cache(frontlist=True)

g.db.commit()
if v.changelogsub: return {"message": "You have subscribed to the changelog!"}

@@ -542,7 +541,7 @@ def settings_block_user(v):
target_id=user.id,
)
g.db.add(new_block)
cache.delete_memoized(frontlist)
invalidate_cache(frontlist=True)
g.db.commit()

return {"message": f"@{user.username} blocked."}

@@ -556,7 +555,7 @@ def settings_unblock_user(v):
x = v.is_blocking(user)
if not x: abort(409)
g.db.delete(x)
cache.delete_memoized(frontlist)
invalidate_cache(frontlist=True)
g.db.commit()

return {"message": f"@{user.username} unblocked."}
@@ -1,17 +1,19 @@
import calendar

import matplotlib.pyplot as plt
from sqlalchemy import func

from files.classes.award import AWARDS
from files.classes.badges import BadgeDef
from files.classes.mod_logs import ACTIONTYPES, ACTIONTYPES2
from files.helpers.alerts import *
from files.helpers.captcha import validate_captcha
from files.helpers.config.const import *
from files.helpers.config.environment import HCAPTCHA_SECRET, HCAPTCHA_SITEKEY
from files.helpers.media import process_image
from files.mail import *
from files.__main__ import app, limiter, mail
from files.helpers.alerts import *
from files.helpers.const import *
from files.helpers.captcha import validate_captcha
from files.classes.award import AWARDS
from sqlalchemy import func
from os import path
import calendar
import matplotlib.pyplot as plt
from files.classes.mod_logs import ACTIONTYPES, ACTIONTYPES2
from files.classes.badges import BadgeDef
import logging
from files.__main__ import app, cache, limiter # violates isort but used to prevent getting shadowed


@app.get('/logged_out/')
@app.get('/logged_out/<path:old>')

@@ -31,13 +33,6 @@ def logged_out(old = ""):

return redirect(redirect_url)

@app.get("/marsey_list")
@cache.memoize(timeout=600, make_name=make_name)
def marsey_list():
marseys = [f"{x.name} : {x.tags}" for x in g.db.query(Marsey).order_by(Marsey.count.desc())]

return str(marseys).replace("'",'"')

@app.get('/sidebar')
@auth_desired
def sidebar(v):

@@ -280,15 +275,13 @@ def api(v):
@app.get("/media")
@auth_desired
def contact(v):
return render_template("contact.html", v=v,
hcaptcha=app.config.get("HCAPTCHA_SITEKEY", ""))
return render_template("contact.html", v=v, hcaptcha=HCAPTCHA_SITEKEY)

@app.post("/send_admin")
@limiter.limit("1/second;2/minute;6/hour;10/day")
@auth_desired
def submit_contact(v: Optional[User]):
if not v and not validate_captcha(app.config.get("HCAPTCHA_SECRET", ""),
app.config.get("HCAPTCHA_SITEKEY", ""),
if not v and not validate_captcha(HCAPTCHA_SECRET, HCAPTCHA_SITEKEY,
request.values.get("h-captcha-response", "")):
abort(403, "CAPTCHA provided was not correct. Please try it again")
body = request.values.get("message")
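The hCaptcha changes above stop reading keys out of `app.config` and import them from `files.helpers.config.environment` instead. That module is not included in this diff; presumably it resolves the values once at import time, roughly along these lines (variable names taken from the import, defaults guessed):

# files/helpers/config/environment.py -- hypothetical sketch, not from this diff
from os import environ

HCAPTCHA_SITEKEY: str = environ.get("HCAPTCHA_SITEKEY", "").strip()
HCAPTCHA_SECRET: str = environ.get("HCAPTCHA_SECRET", "").strip()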
@@ -1,27 +1,33 @@
import qrcode
import io
import time
import math
import time
from collections import Counter
from urllib.parse import urlparse

from files.classes.leaderboard import SimpleLeaderboard, BadgeMarseyLeaderboard, UserBlockLeaderboard, LeaderboardMeta
import gevent
import qrcode

import files.helpers.listing as listings
from files.__main__ import app, cache, limiter
from files.classes.leaderboard import (BadgeMarseyLeaderboard, LeaderboardMeta,
SimpleLeaderboard, UserBlockLeaderboard)
from files.classes.views import ViewerRelationship
from files.helpers.alerts import *
from files.helpers.assetcache import assetcache_path
from files.helpers.config.const import *
from files.helpers.contentsorting import apply_time_filter, sort_objects
from files.helpers.media import process_image
from files.helpers.sanitize import *
from files.helpers.strings import sql_ilike_clean
from files.helpers.const import *
from files.helpers.assetcache import assetcache_path
from files.helpers.contentsorting import apply_time_filter, sort_objects
from files.mail import *
from flask import *
from files.__main__ import app, limiter
from collections import Counter
import gevent
from files.routes.importstar import *


# warning: do not move currently. these have import-time side effects but
# until this is refactored to be not completely awful, there's not really
# a better option.
from files.helpers.services import *
from files.helpers.services import *


@app.get("/@<username>/upvoters/<uid>/posts")
@admin_level_required(3)

@@ -673,7 +679,7 @@ def u_username(username, v=None):
try: page = max(int(request.values.get("page", 1)), 1)
except: page = 1

ids = u.userpagelisting(site=SITE, v=v, page=page, sort=sort, t=t)
ids = listings.userpagelisting(u, site=SITE, v=v, page=page, sort=sort, t=t)

next_exists = (len(ids) > 25)
ids = ids[:25]

@@ -862,9 +868,6 @@ def remove_follow(username, v):

return {"message": "Follower removed!"}

from urllib.parse import urlparse
import re

@app.get("/pp/<int:id>")
@app.get("/uid/<int:id>/pic")
@app.get("/uid/<int:id>/pic/profile")
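The `u_username` hunk above replaces the old `User.userpagelisting` method call with `listings.userpagelisting(u, ...)` from the new `files.helpers.listing` module. The module body is not shown here; a rough sketch of the moved function, assuming it keeps the old query and the 25-per-page behaviour implied by `next_exists = (len(ids) > 25)`, with sorting and time filtering omitted:

# Hypothetical sketch of files/helpers/listing.py; not part of this diff.
from flask import g

from files.classes.submission import Submission

def userpagelisting(u, site=None, v=None, page=1, sort="new", t="all"):
    # Fetch 26 ids so the caller can detect whether a next page exists.
    ids = (g.db.query(Submission.id)
           .filter_by(author_id=u.id, is_banned=False, deleted_utc=0)
           .offset(25 * (page - 1))
           .limit(26)
           .all())
    return [x[0] for x in ids]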
@@ -1,18 +1,16 @@

from datetime import datetime, timedelta
from typing import Optional

import sqlalchemy
from flask import abort, g, render_template, request

import files.helpers.jinja2
import files.routes.volunteer_janitor
from files.__main__ import app
from files.classes.user import User
import files.helpers.jinja2
from files.helpers.wrappers import auth_required
from files.routes.volunteer_common import VolunteerDuty
import files.routes.volunteer_janitor
from flask import abort, render_template, g, request
from os import environ
import sqlalchemy
from typing import Optional
import pprint



@files.helpers.jinja2.template_function
@@ -1,10 +1,13 @@
from files.helpers.wrappers import *
from files.helpers.get import *
from files.helpers.const import *
from files.classes import *
from flask import *
from files.__main__ import app, limiter, cache
from os import environ
from files.__main__ import app, limiter
from files.classes.comment import Comment
from files.classes.submission import Submission
from files.classes.votes import CommentVote, Vote
from files.helpers.config.const import OWNER_ID
from files.helpers.config.environment import ENABLE_DOWNVOTES
from files.helpers.get import get_comment, get_post
from files.helpers.wrappers import admin_level_required, is_not_permabanned
from files.routes.importstar import *


@app.get("/votes")
@limiter.exempt

@@ -14,8 +17,8 @@ def admin_vote_info_get(v):
if not link: return render_template("votes.html", v=v)

try:
if "t2_" in link: thing = get_post(int(link.split("t2_")[1]), v=v)
elif "t3_" in link: thing = get_comment(int(link.split("t3_")[1]), v=v)
if "t2_" in link: thing = get_post(link.split("t2_")[1], v=v)
elif "t3_" in link: thing = get_comment(link.split("t3_")[1], v=v)
else: abort(400)
except: abort(400)

@@ -54,11 +57,11 @@ def admin_vote_info_get(v):
@is_not_permabanned
def api_vote_post(post_id, new, v):

# make sure we're allowed in (is this really necessary? I'm not sure)
# make sure this account is not a bot
if request.headers.get("Authorization"): abort(403)

# make sure new is valid
if new == "-1" and environ.get('DISABLE_DOWNVOTES') == '1': abort(403, "forbidden.")
if new == "-1" and not ENABLE_DOWNVOTES: abort(403)
if new not in ["-1", "0", "1"]: abort(400)
new = int(new)

@@ -122,11 +125,11 @@ def api_vote_post(post_id, new, v):
@is_not_permabanned
def api_vote_comment(comment_id, new, v):

# make sure we're allowed in (is this really necessary? I'm not sure)
# make sure this account is not a bot
if request.headers.get("Authorization"): abort(403)

# make sure new is valid
if new == "-1" and environ.get('DISABLE_DOWNVOTES') == '1': abort(403, "forbidden.")
if new == "-1" and not ENABLE_DOWNVOTES: abort(403)
if new not in ["-1", "0", "1"]: abort(400)
new = int(new)
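The vote routes above switch from the inline `environ.get('DISABLE_DOWNVOTES') == '1'` check to an `ENABLE_DOWNVOTES` constant imported from `files.helpers.config.environment`. How that constant is computed is not shown in this diff; one definition that would keep the two checks equivalent is sketched below (the real module may read a differently named variable or parse booleans differently):

# Hypothetical definition, not taken from this diff.
from os import environ

ENABLE_DOWNVOTES: bool = environ.get("DISABLE_DOWNVOTES", "0") != "1"

With that in place, `if new == "-1" and not ENABLE_DOWNVOTES: abort(403)` rejects downvotes in exactly the cases the old inline check did.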