wertigon |
Posted on 19-06-02, 20:34
Post: #38 of 205
Since: 11-24-18 Last post: 175 days Last view: 1 day |
Re: slow piece of crap hogs of website excuses for crapdog fests, I agree things *should* be more streamlined. After all, reddit - it's mostly text and images, amirite?

Yes, but modern websites *are* among the most complicated things to write these days. First off, you need five different areas of expertise:

* UI Design (Photoshop)
* Front-End development (HTML, CSS and JavaScript)
* Back-End development (one of C#, PHP, JavaScript, Python or Ruby)
* Database development (usually SQL of some sort)
* Server administration (webserver, database and webapp environment)

Each of these layers requires some glue connecting it to the next:

* The UI designer chops up the design into PNG and SVG files.
* The Front-End developer writes templates that describe how to render the HTML and CSS required for the JavaScript to work.
* The Back-End developer consumes those templates, plus a REST-based JSON API to communicate with the Front-End and SQL to talk to the database.
* The Database developer uses SQL. No really, that's all he or she does.
* The server admin, finally... well, doesn't do much once everything is set up and logging properly, I suppose.

So that's at least 9 hand-offs in the chain where efficiency gets lost. It's 9 gears, and each one loses a little. If we assume each hand-off loses 5% efficiency, you're left with roughly 0.95^9, or about 63% of the original efficiency - a loss of around 37% just from having 9 freaking layers. Add to that the network communication involved at each layer, not to mention that the modern browser engine has to render as much on screen as a win32 app, well... No wonder things are big and slow and bloated.

How to fix this? I have no fricken clue...

Disclaimer: IAAFSD (I Am A Full Stack Developer)
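A quick back-of-the-envelope check of that compounding (the 5% per hand-off figure is just the assumption above, not a measurement):

```php
<?php
// Nine hand-offs in the chain, each assumed to lose 5% efficiency.
$efficiency = 1.0;
for ($i = 0; $i < 9; $i++) {
    $efficiency *= 0.95;
}
// Prints "Remaining efficiency: 63.0%", i.e. roughly a 37% loss overall.
printf("Remaining efficiency: %.1f%%\n", $efficiency * 100);
```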
Duck Penis |
Posted on 19-06-02, 22:22
Stirrer of Shit
Post: #354 of 717 Since: 01-26-19 Last post: 1783 days Last view: 1781 days |
Why does it have to be this way, though? The old design had none of these issues. Sure, it looked old, but the other stuff was fine. They could just have changed the theme (see: CSS Zen Garden) and been done with it. There is no reason at all for it to use that many resources.

The back-end's inefficiency has nothing to do with the rest. If they write their code in slow Python, that's their loss; the website can still be blazing fast for the end user. If they did write it in C/C++/Rust, bandwidth would very quickly become the limiting factor, but the issue at hand is that the site is slow for the users. Which leaves UI design and front-end dev as the possible culprits.

UI design is just making some image files. You could do it with pen and paper. We did in school, in fact. Worked fine. Sure, it was cumbersome, but it holds up as a thought experiment. My point is, those image files are just for getting looked at, not for actual use* in the final product.

*Of course, some stuff like the up/downvote arrows is used, but that's hardly what's making reddit slow.

Front-end is making a website that looks as close to the design as possible. In other words, the design has no bearing on performance, unless it stipulates, say, "the website has to do a 3D cube transform when it opens a comment". Then this front-end design gets chopped up into some templates, so it can put in stuff like "wertigon," "19-06-02, 08:34 pm," etc. Then, optionally, you throw in some JavaScript to make the bold buttons etc. work. But you could do without it. None of these are inherently slow. This forum is fast.

Which raises the question: just why are modern websites "one of the most complicated pieces to write these days"? Just why do you need a REST-based JSON API to accomplish things that good old PHP did much faster ten years ago? Just why do you need to enable JavaScript for things that could be done without it?

Whenever you ask people about this, the answer is that the websites have to look good. But this is a complete non sequitur! I mean, it would not be difficult to re-theme this board to look exactly like Discourse.

There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this.
Duck Penis |
Posted on 19-06-02, 22:54
Stirrer of Shit
Post: #355 of 717 Since: 01-26-19 Last post: 1783 days Last view: 1781 days |
Here, discount Discourse:
There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this. |
Kawaoneechan |
Posted on 19-06-02, 22:57
Ensemble Darkpony
Post: #249 of 599 Since: 10-29-18 Last post: 215 days Last view: 3 hours |
Fun fact: years back, I made an attempt to write yet another AcmlmBoard. From scratch, like ABXD, but completely different. It had a separate mobile mode with large buttons, and I believe it was visually somewhat inspired by Disgust. And its name was ABΔ.
CaptainJistuce |
Posted on 19-06-03, 01:00
Custom title here
Post: #487 of 1164 Since: 10-30-18 Last post: 83 days Last view: 4 hours |
I approve of the use of delta. --- In UTF-16, where available. --- |
wertigon |
Posted on 19-06-03, 17:58 (revision 2)
Post: #39 of 205
Since: 11-24-18 Last post: 175 days Last view: 1 day |
Posted by sureanem

I agree with a lot of what you are saying. The basic gist of it is that very few people understand how CSS and HTML are supposed to work together, and most refuse to use JavaScript in a sensible manner. Oh, and the whole desktop-vs-mobile web.

That, and they want on-the-fly JS loading stuff. For instance, I tend to frequent the Level1 forums these days. They have a lot of cool stuff, like an editor that shows an auto-updating preview of what your post will look like, entirely JS driven. Not to mention that fancy scroll-down-for-more-content loading thingie. Or links that load either previews or, in the case of sites like YouTube, the embedded video frame.

The most sane thing I ever designed was a PHP mechanism that allowed me to load parts of a website. Basically, each template had a choice to either render in full, or only render the relevant parts. If you clicked the link with a JSON call, it only loaded the relevant part. If you did not, you reloaded the entire page. This required a slightly ugly quirk in every template, roughly along the lines of the sketch below, but otherwise it worked like a charm. Of course, a JSON-driven JavaScript loader would have been even better but...
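A minimal sketch of what such a per-template quirk might look like, assuming a hypothetical ?partial=1 flag sent along by the JSON-style links (illustrative only, not wertigon's actual code):

```php
<?php
// Each template can render either a full page or just its own fragment,
// depending on a flag that the partial-load link sends along.
$partial = isset($_GET['partial']) && $_GET['partial'] === '1';

if (!$partial) {
    // Full page load: emit the surrounding chrome first.
    echo '<!DOCTYPE html><html><body><header>My Site</header>';
}

// The part both modes share: the actual content of this template.
echo '<main id="content">Thread list goes here...</main>';

if (!$partial) {
    // Full page load: close the chrome again.
    echo '</body></html>';
}
```

The ugly part is exactly that every template has to carry the same if/else boilerplate around its content.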
Kawaoneechan |
Posted on 19-06-03, 18:52
Just... a litl creacher
Post: #250 of 599 Since: 10-29-18 Last post: 215 days Last view: 3 hours |
Here's what the "discount Discourse" thing'd look like, in case anyone's too lazy to edit the page: |
Duck Penis |
Posted on 19-06-03, 23:05
Stirrer of Shit
Post: #358 of 717 Since: 01-26-19 Last post: 1783 days Last view: 1781 days |
What do you mean by desktop-vs-mobile web? These issues are far worse on phones, since you get a more primitive UI and worse performance. For instance, developers doing odd things with the zoom feature isn't a big deal on computers. But on a phone, it severely hurts usability if the page freezes up for a good second whenever you zoom in or out.

I mean, it all comes back to "web developers are stupid," because, to be blunt, "understanding how CSS and HTML are supposed to work together" and "using JavaScript in a sensible manner" constitute good chunks of their job description. It's like claiming someone is a great translator except for the part where they only speak English. That person would be considered grossly incompetent and a fraud, and would promptly get fired within a matter of seconds. But for web developers, this is just... normal? I really don't intend to rag on your profession, but it does seem like you're implying that a sizable majority of them are too incompetent to carry out their job. This sounds very bad.

Posted by wertigon

That's reasonable. It degrades nicely and doesn't have very high performance costs. I don't like dynamic loading on forums (just set the default post limit to something like 500), but it doesn't have a great performance impact if done correctly. Same for previews: hitting a button to generate one isn't really hard work, and even doing it directly in the browser has near-zero performance cost.

Discourse has the editor (botched: it sends an HTTP request for each letter you type, so there's a delay of about a second) and dynamic loading (completely botched: scrolling either up or down makes you wait for several seconds, and it also breaks search, printing, and presumably saving pages). Reddit has dynamic loading, which is the only thing that isn't broken on that site.

Why does the end result always end up so bad? It's not like these features are intrinsically bad. I don't know of any good examples of the editor, but 4chan does dynamic loading without breaking a sweat. The politically incorrect answer is that 4chan wasn't designed by a so-called "professional web developer," and thus automatically works 100x better. But can this really be the reason why modern websites are so bad? It seems overly simplistic, even by my standards.

Posted by wertigon
The most sane thing I ever designed was a PHP mechanism that allowed me to load parts of a website. Basically, each template had a choice to either render in full, or only render the relevant parts. If you clicked the link with a JSON-call, it only loaded the relevant part. If you did not, you reloaded the entire site. This required a slightly ugly quirk for every template to be like this:

Why would JSON have been better? With this, you do less processing in the browser, and in exchange do some extremely cheap templating on the server side. It seems like a perfect solution. JS already has the faculties for injecting HTML, so it would be like two lines of code on the client, maybe ten on the server, perfect extensibility, and minimal performance costs. Compare this to serializing and deserializing JSON. It seems like a really clean solution; I don't see anything wrong with it.

There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this.
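For what it's worth, the server side of that return-HTML-instead-of-JSON approach really can fit in about ten lines. A sketch under assumed names (the render function, endpoint, and element IDs are all made up for illustration):

```php
<?php
// The server renders the fragment with the same template code it uses for
// full pages, and the client injects the result as-is: no JSON
// (de)serialization, no client-side templating.

function render_post_row(array $post): string {
    // In a real board this would be the shared template, not an inline string.
    return '<div class="post"><b>' . htmlspecialchars($post['author']) . '</b>: '
         . htmlspecialchars($post['body']) . "</div>\n";
}

header('Content-Type: text/html; charset=utf-8');

// Stand-in for a database query fetching the requested page of posts.
$posts = [['author' => 'wertigon', 'body' => 'example post']];

foreach ($posts as $post) {
    echo render_post_row($post);
}

// The corresponding client side is roughly the promised two lines of JS:
//   const html = await (await fetch('/posts.php?page=2')).text();
//   document.querySelector('#posts').insertAdjacentHTML('beforeend', html);
```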
neologix |
Posted on 19-06-04, 02:20
Post: #40 of 49 Since: 10-29-18 Last post: 1921 days Last view: 1806 days |
Among other professional programming endeavors, I am a web dev; I've been doing it professionally since 2006, though I've been practicing since 1997.

There's a lot of conflation here of terrible "modern" practices in both front- and back-end web dev (forcing the use of Node, Grunt, or the supposedly easy "package management solution" du jour, or of Angular, Meteor, and other unnecessary MVC frameworks) with reasonable practices like version control and usable IDEs or editors. There's a lot I'd love to write to confirm and/or dispel some misconceptions some people here seem to have about front-end and back-end web dev in general, and about HTML, CSS, JavaScript, and even PHP specifically. I'll need to take some time to get these particular thoughts organized, as a lot of it has to do with my personal experience of having learned and practiced everything from scratch, first without competent IDEs or editors, and of having constantly studied and learned how to properly filter and curate resources like blogs, videos, and hardcopy books.

The tl;dr is that the (web) dev process is primarily bureaucratic, controlled by whatever resources the boss is willing to allocate towards what they potentially feel is something any monkey, freelancer on Fiverr, or outsourced worker in the Middle East, Asia, or India can do for less (i.e. what they feel it's "worth"). Given that, a competent (web) dev will be able to succeed regardless of what environment they're forced to work with due to workplace politicking and/or red tape, because they'll have learned proper fundamentals and, especially, how to adapt. That's why I've earned the reputation of being The Fixer, The Finisher, and The Reverse Engineer, and those underpriced monkeys haven't.
funkyass |
Posted on 19-06-04, 02:45
Post: #46 of 202
Since: 11-01-18 Last post: 680 days Last view: 35 days |
How much of this replication of inherent browser functionality in JavaScript is there for tracking?
Kawaoneechan |
Posted on 19-06-04, 14:56 (revision 1)
Mythbuster
Post: #252 of 599 Since: 10-29-18 Last post: 215 days Last view: 3 hours |
Just FYI, |
Duck Penis |
Posted on 19-06-04, 15:14
Stirrer of Shit
Post: #361 of 717 Since: 01-26-19 Last post: 1783 days Last view: 1781 days |
Posted by Kawa Oh no. I thought I should have included a disclaimer so you wouldn't get any ideas. You shouldn't blame wertigon for it though. I mean, don't blame me either (if you use that thing, whatever happens is entirely your fault), but he had nothing to do with it. There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this. |
Kawaoneechan |
Posted on 19-06-04, 15:39
Draco in Leather Pants
Post: #253 of 599 Since: 10-29-18 Last post: 215 days Last view: 3 hours |
You're quite right, I blamed the wrong person there. I'd forgotten which one of you it was, scrolled up, decided it was werti and called it a day. Cos my day so far has been quite a load. Not removing it anymore though 😏 |
Nicholas Steel |
Posted on 19-06-05, 04:41 (revision 4)
Post: #209 of 426
Since: 10-30-18 Last post: 518 days Last view: 3 days |
Here's a handy add-on for Firefox that stops Facebook tracking: https://addons.mozilla.org/firefox/addon/facebook-container/ It... has the same caveats that all the other social media blocking add-ons have. /sigh

AMD Ryzen 3700X | MSI Gamer Geforce 1070Ti 8GB | 16GB 3600MHz DDR4 RAM | ASUS Crosshair VIII Hero (WiFi) Motherboard | Windows 10 x64
Kakashi |
Posted on 19-06-05, 14:36
Post: #127 of 210 Since: 10-29-18 Last post: 1896 days Last view: 1868 days |
There are also containers for Google, Reddit, and Amazon. Probably more. Some variants do more than others; for example, there's a more complete version of the Mozilla Facebook container.
Kawaoneechan |
Posted on 19-06-06, 16:28
20% cooler than thou art
Post: #254 of 599 Since: 10-29-18 Last post: 215 days Last view: 3 hours |
Guess what I just dug up? https://helmet.kafuka.org/abdelta |
Duck Penis |
Posted on 19-06-06, 17:09
Stirrer of Shit
Post: #366 of 717 Since: 01-26-19 Last post: 1783 days Last view: 1781 days |
It feels futuristic, definitely. I like the Windows-esque error messages. There was a certain photograph about which you had a hallucination. You believed that you had actually held it in your hands. It was a photograph something like this. |
neologix |
Posted on 19-06-08, 02:33
Post: #42 of 49 Since: 10-29-18 Last post: 1921 days Last view: 1806 days |
That ABD ain't bad at all. Now howsabout pointing me to a board theme so I can contribute one to this board, eh? |
Kawaoneechan |
Posted on 19-06-08, 05:50
Secretly, I'm Rainbow Dash
Post: #260 of 599 Since: 10-29-18 Last post: 215 days Last view: 3 hours |
http://helmet.kafuka.org/bboard/css/zenburn.css |
neologix |
Posted on 19-06-09, 23:17
Post: #44 of 49 Since: 10-29-18 Last post: 1921 days Last view: 1806 days |
Only the CSS? Aren't there PHP-based template components, like most other PHP forums have?