
Danbooru changelog discussion thread

Posted under Bugs & Features

Claverhouse said:

What precious mincing nonsense. Pale Moon is frequently updated, and I will sooner give up the internet than use the filthy mess caused by Google's mania to control the whole of it (helped by Microsoft's original sin of monopoly). The browser monoculture, by causing stagnation just as much as in the bad old days of IE6, is weakening the Internet's strength and resilience as much as the use of CDNs and the concentration, driven by Google Search's power to dictate which sites people go to, into a small number of giant sites.

That has nothing to do with JavaScript. Pale Moon is developed by someone who's very opinionated and refuses to support new features of the language because they think they know better, so it will always be an inferior choice compared to the rest.

Should be working again, although unhiding hidden comments will still be broken on these browsers.

Danbooru only officially supports mainstream browsers (Firefox/Chrome/Safari) released in the last 5 years. Other browsers usually work but are not officially supported. Please spare me the lectures about muh modern webdev bad, muh Chrome monoculture bad; I use Firefox myself and I already go out of my way to support older browsers.

@evazion: stuff is now unbroken for me, yay~ Thanks!

In the case of alternate browsers, particularly SeaMonkey, progress is slow due to the lack of manpower... but they're still trying! (FWIW, SeaMonkey 2.53.11 just released last Monday)

EDIT: Oh, found the relevant Git commit notes. Yeeeeah, we will be waiting for quite some time to get ES2022 syntax updates on our non-mainstream browsers, as backporting stuff is not easy (we're still dealing with WebComponents pain as the spec is insanely complex and a very moving target!)


nonamethanks said:

That has nothing to do with JavaScript. Pale Moon is developed by someone who's very opinionated and refuses to support new features of the language because they think they know better, so it will always be an inferior choice compared to the rest.

Umm, yeah... Actually it has everything to do with the problems. Google, the dominant player, is notorious (again like earlier Microsoft) for:
a/ ignoring [ web ] standards;
b/ going ahead with their own;
c/ trying to substitute their own as the agreed standards;
d/ abandoning their projects after a few years.

This, combined with the ever-increasing weight of web pages and, more importantly still, their ever more complex superstructure, especially the JavaScript (spent on ever less needed 'improvements'), means that even at speeds which would have been unimaginable 10 years ago, pages are so cumbersome that things can still be slow.

As for the developer of Pale Moon, I am just grateful to him for providing a better browser; no-one ever criticised Gates or Jobs for their vile tempers...

As with Linus, maybe it goes with the job.

Recently (like 15 mins ago) I started encountering this error while trying to upload; don't know if it was just me:

Unexpected error: Seahorse::Client::NetworkingError.

Details
Seahorse::Client::NetworkingError exception raised
app/logical/sqs_service.rb:28:in `send_message'
app/models/post_version.rb:100:in `queue'
app/models/post.rb:865:in `create_new_version'
app/models/post.rb:850:in `create_version'
app/models/application_record.rb:180:in `save_if_unique'
app/controllers/posts_controller.rb:69:in `create'

NinjaPope said:

Recently (like 15 mins ago) I started encountering this error while trying to upload; don't know if it was just me:

Unexpected error: Seahorse::Client::NetworkingError.

Details
Seahorse::Client::NetworkingError exception raised
app/logical/sqs_service.rb:28:in `send_message'
app/models/post_version.rb:100:in `queue'
app/models/post.rb:865:in `create_new_version'
app/models/post.rb:850:in `create_version'
app/models/application_record.rb:180:in `save_if_unique'
app/controllers/posts_controller.rb:69:in `create'

It's happening to everyone; we can't do anything but wait.

Artists: when adding a URL to an artist entry, the URL will be automatically standardized to the right format if possible.

Please save the user's actual input in the artist history; I don't trust your filters.

Whatever's changing "blog123.domain" to "blog.domain" is particularly bad... there's no guarantee that link even works, and all the Wayback archives are under the blog123 address.

URLs are normalized because I don't trust the user's actual input. I've spent the last week or so manually fixing hundreds if not thousands of bad URLs in artist entries. Every mistake you can imagine has been made, and some you can't.

This is necessary for artist finding to work. If a post has a source like https://blog-imgs-91-origin.fc2.com/l/a/w/lawpula/3155822-image5.jpg, then we know it's from http://lawpula.blog.fc2.com, which matches the artist tngd. We don't know that it's from http://lawpula.blog87.fc2.com. If we put http://lawpula.blog87.fc2.com in the artist entry, then artist finding wouldn't work. There's no way to get from https://blog-imgs-91-origin.fc2.com/l/a/w/lawpula/3155822-image5.jpg to http://lawpula.blog87.fc2.com.
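
Roughly, the mapping works like this (an illustrative Ruby sketch, not the actual Danbooru code; the helper name and the assumption that the username is the fourth path segment are mine, based on the example above):

require "uri"

# Hypothetical sketch: recover the canonical fc2 blog URL from an image source URL.
# The image host encodes the username in the path, e.g.
# https://blog-imgs-91-origin.fc2.com/l/a/w/lawpula/3155822-image5.jpg -> lawpula.
def canonical_fc2_blog_url(source_url)
  uri = URI.parse(source_url)
  return nil unless uri.host.to_s.end_with?(".fc2.com")

  # Path looks like /l/a/w/lawpula/3155822-image5.jpg; the fourth segment is the username.
  username = uri.path.split("/").reject(&:empty?)[3]
  return nil if username.nil?

  # Only the generic blog.fc2.com form is recoverable from the source, never the
  # numbered mirror (blog87.fc2.com), which is why entries are normalized to this form.
  "http://#{username}.blog.fc2.com"
end

canonical_fc2_blog_url("https://blog-imgs-91-origin.fc2.com/l/a/w/lawpula/3155822-image5.jpg")
# => "http://lawpula.blog.fc2.com"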

Previously we dealt with this by secretly normalizing the URL in a hidden field. This was confusing because there were times that artist finding didn't work, and it wasn't clear why. The actual URL we used for artist finding wasn't the one you entered.

Also there were lots of problems before with artist entries containing duplicate URLs. We'd get entries with some combination of variants of the same URL, which are all equivalent, but you don't know that until you normalize them all to one form.
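
For example, a normalizer only has to fold trivially different variants of the same address into one canonical string before comparing; again just a sketch under my own assumptions (scheme, trailing slash, numbered fc2 mirror), not the real normalization rules:

require "uri"

# Hypothetical sketch: fold equivalent artist URL variants into one form so
# duplicates can be detected by plain string comparison.
def normalize_artist_url(url)
  uri = URI.parse(url.strip)
  host = uri.host.to_s.downcase
  # Collapse numbered fc2 mirrors (lawpula.blog87.fc2.com -> lawpula.blog.fc2.com).
  host = host.sub(/\.blog\d+\.fc2\.com\z/, ".blog.fc2.com")
  "http://#{host}#{uri.path.chomp("/")}"
end

[
  "http://lawpula.blog.fc2.com",
  "https://lawpula.blog.fc2.com/",
  "http://lawpula.blog87.fc2.com",
].map { |u| normalize_artist_url(u) }.uniq
# => ["http://lawpula.blog.fc2.com"]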
