Let's say I consider the meme about FOSS programs having perplexingly retarded visual interfaces true. Are there any resources out there to learn the ropes of good design then? Something between "just fork it and make it better" and "we design UX with love, not UI" so it's actually productive. I know there is the Nielsen Norman Group for the web, but that's only one resource.
I think that FOSS software will have inherently worse design because there is no customer whose needs must be satisfied, though there can be donors, which helps. Apart from this, I think that better/faster (get to the end result) documentation would also help. The best design considers people/the users first, which, for example, you don't see in the GNOME project.
You'll likely be using a library unless you want to reinvent the wheel.
The best GUI/TUI is the one that you don't need to read a manual to use. It should be as intuitive as possible to non-autists; in other words, sit random intelligent people down in front of it and ask them what is overly complicated or hard to find. This is where the autists fail at design: they create interfaces that only make sense in their own minds and seem esoteric to everyone else.
If you're going the TUI route, I've heard ncurses made some "interesting" design choices that other curses libraries may have avoided.
>>4094 Ncurses made choices that go well with limited color terminals. Now that everything is using a terminfo similar to xterm-256color, there's no point in using it, unless you like memory leaks.
Try termbox and you'll see what a decent TUI library is like.
I've yet to see a TUI library that isn't just "draw this cell" or "draw this cell in an arbitrary area you have to define for no reason".
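For what it's worth, the "draw this cell" model being complained about fits in a few lines. Here's a minimal sketch of what termbox-style libraries expose; the names (`CellBuffer`, `set_cell`, `present`) are made up for illustration and are not any real library's API.

```python
import sys

class CellBuffer:
    """A screen-sized grid of characters; everything is 'draw this cell'."""
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.cells = [[" "] * width for _ in range(height)]

    def set_cell(self, x, y, ch):
        # The whole API surface: one character at one coordinate.
        self.cells[y][x] = ch

    def present(self, out=sys.stdout):
        # Flush the buffer using ANSI cursor-positioning escapes,
        # one row per write.
        for y, row in enumerate(self.cells):
            out.write("\x1b[%d;1H%s" % (y + 1, "".join(row)))
        out.flush()

buf = CellBuffer(10, 2)
for i, ch in enumerate("hello"):
    buf.set_cell(i, 0, ch)
```

Real libraries add input events and color attributes on top, but the buffer-then-present split is the whole design.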
>pretend the problem with FOSS is complex GUI
>instead of an utterly simple braindamaged bloated piece of shit that doesn't even let you do anything, like Eye of Gnome or Gthumb
Just take a course on UX. >>>/HN/ is a good start
>TUI
There's no such thing. That's just a GUI constructed with text, which is a horrible idea.
>>4142 >TUI
>There's no such thing. That's just a GUI constructed with text, which is a horrible idea.
Aktually it's pretty useful when you're restricted to text mode only (e.g. embedded, remote access) but want something a little nicer looking (e.g. htop vs top, or tmux vs screen). You need a GPU to use a GUI.
>>4094 No that's cancer. The best UI is one which lacks UX for basic bitches; the best UI being the one which allows the power user to get work done the fastest and with much dynamism. So lots of knobs and buttons plainly visible and ready with extensive keyboard macros bound to them. Look at AutoCAD and Siemens NX, definitely not for the basic bitch luser!
The way to do GUIs right now is to write directly to a framebuffer. All higher level libraries are complete shit. Someone could simply make a high level GUI library that isn't shit, but that's not currently possible because they are preoccupied with sucking corporate dick.
>>4094 >The best GUI/TUI is the one that you don't need to read a manual to use.
You don't know what you're talking about. Even the most simple applications of software need to use constructs that are well documented as opposed to just "lol when i type this and that, it seems to do what i want in this case on my machine".
>This is where the autists fail at design, they create interfaces that only make sense in there own minds and seem esoteric to everyone else.
Got an example? Most GUI libraries are exactly the same shit rehashed: GTK, wx, Qt, GDI; hell, even X11 isn't a big change from these.
>>4104 TUIs are shit. why would you even use them when you could just use a framebuffer and be simpler, faster, more usable, more secure (yes, terminals are insecure, which sounds stupid, but is reality because of how utterly badly designed they are)
>>4171 >embedded
how do you get ASCII or utf8 text out of your embedded system? it had to be programmed. it can be programmed to use any protocol just as well
>remote access
>derp i need to ssh into a machine and execute stuff in the terminal on that machine
not a valid use case outside of UNIX braindamage
>you need a GPU to use a GUI
you'd think that after years of GTK braindamage
>>5906 >>The best GUI/TUI is the one that you don't need to read a manual to use.
>You don't know what you're talking about. Even the most simple applications of software need to use constructs that are well documented as opposed to just "lol when i type this and that, it seems to do what i want in this case on my machine".
Exactly this. The best GUIs and TUIs have the manual integrated into the UI. Hover over something and click on the question mark to receive an explanation of what it does.
>TUIs are shit. Why use one when you can use a framebuffer
remote connection, eg SSH.
>not a valid use case outside of UNIX braindamage
Given the two choices are windows and unix, I personally prefer the latter.
>>5906 >The way to do GUIs right now is to write directly to a framebuffer.
Unless you mean a DRMFB, enjoy your slow shit.
>why would you even use them
Let's say X11 is actually garbage that actually can crash or hang, and text-based interfaces are the ONLY way to escape this. I wouldn't mind having a reliable-as-a-brick graphical interface, but that's not the reality at the moment.
Writing to your own framebuffer most probably means writing your own GPU drivers for that framebuffer.
>it can be programmed to use any protocol just as well
That is true. However, text-based interfaces are just naturally easier on the bandwidth, and that is always a concern. Needless to say, there is nothing better than VNC/RDP in this shitty world.
>not a valid use case outside of UNIX braindamage
Are you fucking serious? You ALWAYS want the connection to something of yours today. It doesn't have to be SSH, it doesn't have to be text, but people use that shit all the fucking time. Regular people, mind you, not Unix admins.
It could be a website, I guess, but not everything can be done through that, and for anything serious it would require non-trivial effort that'd be better spent on programming the actual app or, hell, even the actual framework/toolkit with network transparency.
>you'd think that after years of GTK braindamage
Do you use an actual hardware terminal or what the fuck am I hearing right here? You always need a GPU, even for TUI. The question is if you want to run it in backwards[-compatible] mode or you want to extract more potential from this NSA backdoored shit.
But anyway regarding the subject of TUI.
I think it's not worth it. Like, the effort spent on the design of a good TUI app is not worth it if you can do the same shit in GUI reliably.
It can be argued that a lightweight remote connection is good to have, and I would agree, but not a lot of applications would actually need that.
Also an additional simple thought is that curses sucks.
Though I think it's still worth just using the shell and simple shell programs or scripts that usually exit right after being called, producing some output and/or doing some work. Over a remote connection or not, that shit is simple enough to design and use, and hard to fuck up. The most obvious use of that would be ffmpeg.
>>5931 You need GPU to render this shit at the very least.
The more contemporary (like the last 15 years already LOL) trend would be to use some DirectX/OpenGL shit to draw and manipulate graphics loaded directly into the GPU, saving the CPU from actually rendering that shit and then sending it over the system bus to memory that is shared via some dark magic with your GPU. Or whatever along those lines; I wouldn't claim to know how exactly contemporary mobos work. The point is there are tools to manipulate graphics faster than drawing all the shit with the CPU, and everyone wants them, because this is look-and-feel.
>>5932 >Over a remote connection or not, that shit is simple enough to design and use, and hard to fuck up.
Yes, ostensibly simple and hard to fuck up. See UNIX braindamage threads on here and 8gag/tech.
>The most obvious use of that would be ffmpeg.
You mean the most obvious example of fucking up?
>>5933 And yet GPU rendering is slow and buggy as fuck in practice (perhaps only because the devs of stuff like GTK and cairo are retards, but even if you did everything properly, I'd think there's still no way to do vsync without input lag when using the GPU for rendering). I implemented an entire image viewer (and some other applications) with CPU rendering, and it's better in every way than every other GUI I've ever seen. Start rendering from top to bottom at VBLANK, render the entire thing in ~3ms. Even works on my 13 year old laptop.
>>5928 >Unless you mean a DRMFB, enjoy your slow shit.
You mean the DRM API of Linux? What I mean by framebuffer is simply an area you can write to where GPU scans out to the monitor.
>Writing to your own framebuffer most probably means writing your own GPU drivers for that framebuffer.
Linux provides two portable APIs for framebuffers, the "framebuffer api", and DRM.
>Are you fucking serious? You ALWAYS want the connection to something of yours today.
It should be done over tailored protocols where needed (but webshit for example which is 99% of ssh use cases today shouldn't even exist - you should just hit publish and a content-addressable network instantly mirrors all your content). For example I used to listen to music on phone / some computers, then send the names of the tracks to a Tor hidden service (so I don't need an IP address or domain name or any such UNIX braindamage, but also because this is how the internet should be by default), which used a simple protocol with authentication (so it doesn't get spammed) that will post the tracks I listen to on last.fm. Such protocols get even easier when you leave UNIX braindamage land and no longer have to deal with serialization bullshit.
Now let's compare the UNIX braindamage implementation of this protocol:
1. buy a domain name or static IP address
2. SSH into the machine which has the last.fm communication tool
3. send a bunch of strings of text as commands to be executed to call my last.fm communication tool (e.g lastfm_update --artist "justin beiber" --songname "aids" --album "dicks")
4. eventually some music has some special character in the name, which breaks this, and someone watching my last.fm observes this and sends me a song specially crafted to get root access in my SSH session
>Do you use an actual hardware terminal or what the fuck am I hearing right here?
The guy I replied to looks to be claiming that you need to use GPU rendering to make a GUI. Which is false. Even in practice.
>>5935 >Yes, ostensibly simple and hard to fuck up. See UNIX braindamage threads on here and 8gag/tech.
How often do you fuck up ls or ps command? Come on now.
>You mean the most obvious example of fucking up?
ffmpeg -i muhvidya.mp4 -c:v libvpx-vp9 -c:a libwhatever -moreoptionstorenderagoodvid muhvidya.webm
Whatever along these lines will just work on some remote machine with a strong CPU or something, and you can automate this shit easily.
Now, ffmpeg is probably not the best example of the simple software, but it is simple enough compared to shit like sonyvegas or whatever else.
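Since the point of the example is running this unattended on a remote machine, the safest automation builds the ffmpeg invocation as an argv list so no shell ever parses the filename. A sketch (the codec names are the real ffmpeg encoder names for VP9/Opus, substituted for the post's placeholders; nothing runs unless ffmpeg is installed):

```python
# Build the ffmpeg command from the post as an argv list. A filename like
# "muh vidya'; rm -rf /.mp4" stays one argument and is never shell-parsed.
# Run it with subprocess.run(cmd, check=True) when ffmpeg is available.
infile = "muhvidya.mp4"
cmd = [
    "ffmpeg",
    "-i", infile,           # input file as a single argv element
    "-c:v", "libvpx-vp9",   # the VP9 encoder ffmpeg actually ships
    "-c:a", "libopus",
    "muhvidya.webm",
]
```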
> And yet GPU rendering is slow and buggy as fuck in practice
X is slow and buggy as fuck in practice. GPU - I am not so sure.
>I implemented an entire image viewer (and some other applications) with CPU rendering, and it's better in every way than every other GUI I've ever seen.
I am genuinely interested, can I see the code?
But anyway, a few questions:
1) What resolution do you run this shit on?
2) What is the software stack you're using anyway? DRMFB? X facilities? libsdl? Some other kernel framebuffer?
3) Can you run this shit on a 1080p monitor in native resolution?
4) Did you try to write a video renderer via whatever you're using?
>>5937 >You mean the DRM API of Linux? What I mean by framebuffer is simply an area you can write to where GPU scans out to the monitor.
So, you need GPU to work with that shit. BUSTED xD
>The guy I replied to looks to be claiming that you need to use GPU rendering to make a GUI. Which is false. Even in practice.
No GPU rendering means CPU rendering only, which means ALL transitions from a model to a bitmap have to be done with CPU. Ain't nothing exactly wrong with that, but it is slow for some tasks, like maybe window dragging, smooth scrolling or some shit like that. Now, I realize and agree that it isn't essential to the GUI per se, but UI smoothing like that is nice.
>Linux provides two portable APIs for framebuffers, the "framebuffer api", and DRM.
I remember using raw framebuffer when sitting in loonix VTs, but something like vesafb in X was slow as balls.
>For example I used to listen to music on phone / some computers, then send the names of the tracks to a Tor hidden service (so I don't need an IP address or domain name or any such UNIX braindamage, but also because this is how the internet should be by default)
You do need an "external" IP to run an onion service, I believe. I don't see how this is a "UNIX braindamage" issue, sorry.
> Now let's compare the UNIX braindage implementation of this protocol:
>4
The golden rule of any parsing/lexing is to make it as dumb as possible. If you don't want any special characters, don't support them.
And anyway, clearly typing all this shit is already tedious as fuck, and you probably would like to use a GUI here. And now you want to use limited parsing or whatever of some webpages on last.fm, which is a service you don't control, which means you have to bow to the overlord changing the site or even closing it as they please, and basically this is fucking stupid. Just put up your own database or some shit.
>5939
>X is slow and buggy as fuck in practice. GPU - I am not so sure.
What I'm saying is that every program that does GPU rendering is slow. GPU rendering is probably (definitely) fast, but the users are all retards.
>1) What resolution do you run this shit on?
I think the highest I tried was around 1080p@60Hz.
>2) What is the software stack you're using anyway? DRMFB? X facilities? libsdl? Some other kernel framebuffer?
Initially I used the "linux framebuffer" API (yes with vesafb and some other implementations) but then switched to DRM or "DRM-KMS" because it has more useful features. "linux framebuffer" has no vsync signal (it has it as an option, but nothing supports it, for example vesafb doesn't). I plan to port to Windows soon but dunno yet how you get direct framebuffer access there. Even if I have to hack it in I'll do it.
> 4) Did you try to write a video renderer via whatever you're using?
No but I don't see a big problem there. The big part of mpeg based shit is decompressing a keyframe (I think they're called I-frames, but there are other types that may matter as well), which I assume is roughly equivalent to decompressing a JPEG file of 1280x1080 or whatever small resolution the video is in. The only hard part of my image viewer is decompressing a JPEG because at 16000x10000 it takes a long time on any CPU. It simply stores the decompressed image in memory (taking hundreds of MB), but that's the only way to actually have realtime panning/zooming - because we can't decompress the JPEG at each frame (actually it may be possible, you can decompress only a region of the file at a time, which is one of my future things to try because it will mean 0 input latency during loading the next image in a gallery, which no other image viewer even remotely supports).
I may post the code here later if I get time. It has 10K lines of experiments with scaling that need to be removed, and some simple portability issues to fix. Also the scaling needs to be written in assembly to be fast for up to 1/4 downscale, and my current (x86) assembly solutions are slow: ~50ms at 1280x1024 when approaching 1/4 downscale on the 13 year old laptop I mentioned (1.6GHz, 4 billion instructions per second peak); the fast machine I have is somewhere around 20-30ms, but it only downscales 908/256 and needs to be generalized. But scaling is an edge case; CPU rendering in general I believe is viable for most applications.
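The "hundreds of MB" figure for holding the decompressed 16000x10000 JPEG in memory checks out with simple arithmetic (assuming a framebuffer-friendly 32-bit pixel format):

```python
# Back-of-the-envelope check of the post's numbers: a 16000x10000 image
# held fully decompressed in memory.
width, height = 16000, 10000
bytes_per_pixel = 4                                # 32-bit RGBX
size_mb = width * height * bytes_per_pixel / 1e6   # -> 640.0 MB
```

At 3 bytes per pixel it's 480 MB, so either way "hundreds of MB" is accurate.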
>>5941 >No GPU rendering means CPU rendering only, which means ALL transitions from a model to a bitmap have to be done with CPU. Ain't nothing exactly wrong with that, but it is slow for some tasks, like maybe window dragging, smooth scrolling or some shit like that.
My image viewer has smooth scrolling. It writes out every single pixel to the monitor at each frame, and takes only a few ms. Now if you scale the image, it will take on the order of 10s of ms depending on your CPU, which long story short will mean input lag if it gets slower than your refresh rate. But the point is, you can write out to every pixel on the monitor using the CPU each frame and still be fast. I don't see why dragging a Window should be slow with CPU rendering either and if it is, it's probably an X11 or GTK problem.
>You do need an "external" IP to run an onion service, I believe. I don't see how this is a "UNIX braindamage" issue, sorry.
IP addresses and domain names are pure UNIX brain damage. look at your shell, does it say "localhost" on each line?
>The golden rule of any parsing/lexing is to make it as dumb as possible. If you don't want any special characters, don't support them.
Yeah, I can put '' around my arguments, and then whitelist only alphanumeric characters and spaces. But that's shit and still could be insecure who knows, I don't know the grammar for the shell. But I already solved the problem by not taking the UNIX route.
> And anyway clearly typing all this shit is already tedious as fuck, and you probably would like to use GUI here.
In the UNIX braindamage process, I meant there's a program that SSHes into the machine and runs the lastfm_update command. None of this would be manual.
>Just put up your own database or some shit.
It will still have parsing issues. There's no way to work with abstract syntax trees with any SQL server I know of. But anyway I already solved this problem. The solution is to not use UNIX or any derivative of it (nor windows or mac).
>908/256
I meant 256/908
1/4 means making the image 1/4th the size. 256/908 means making the image 256 908ths of the original size (which is still roughly 1/4, but 256/908 is the slowest scaling ratio, so I optimized specifically for it).
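One way to read the fractional-ratio scaling above is plain index arithmetic: for a ratio num/den, output pixel i samples source position i*den/num. A nearest-neighbor sketch (the poster's assembly presumably does a proper area filter; this only shows the indexing):

```python
# Fractional downscale of one row at ratio num/den (e.g. 256/908) using
# nearest-neighbor sampling. Illustrative, not the poster's actual filter.
def downscale(row, num, den):
    out_len = len(row) * num // den
    # source index for output pixel i is i * den / num, truncated
    return [row[i * den // num] for i in range(out_len)]

row = list(range(908))
small = downscale(row, 256, 908)   # 908 source pixels -> 256 output pixels
```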
>>5943 >GPU rendering is probably (definitely) fast, but the users are all retards.
Maybe the issue is that it is not always appropriate to use actual rendering commands for displaying shit.
>DRM or "DRM-KMS"
Well, the thing about that is, I believe, something-something-direct-access-to-the-gpu, so no wonder it's fast.
https://dri.freedesktop.org/wiki/DRM/
>I may post the code here later if I get time.
Thanks, I would appreciate it.
>But scaling is an edge case, CPU rendering in general I believe is viable for most applications.
Well, scaling is rather important, especially for video rendering, when we have to fit some video to the screen when the resolutions don't match. Other than that, windows being resized are probably better off cropping content than actually scaling it in realtime. Though scaling this page with all those pictures would probably take longer with CPU rendering too.
But I'm curious how much can be done with the CPU here. Though arguably that would have to be some beefy CPU with CISC arch, so that's unfortunate.
Also another point is that we would actually load the CPU by doing this shit, which is not always desirable. Using GPU where appropriate is probably a good idea.
>IP addresses and domain names are pure UNIX brain damage. look at your shell, does it say "localhost" on each line?
It says my custom name of the machine, but anyway, you either assign "human-readable" names via some system or you assign some whatever names via some algorithm, what exactly is UNIX's fault here?
>still could be insecure who knows, I don't know the grammar for the shell
I don't know what you're talking about. AFAIK, anything between '' is to be interpreted literally, meaning no funky business with dollar signs or whatever. Stuff between "" is another story though, now that shit can surprise you.
>But I already solved the problem by not taking the UNIX route.
Do you have your own OS ready already or what the actual fuck?
>In the UNIX braindamage process, I meant there's a program that SSHes into the machine and runs the lastfm_update command. None of this would be manual.
So, you hypothetically propose to write a program that talks to a UNIX braindamaged program that talks to the actual place. This isn't really the UNIX braindamaged way, sorry. The UNIX braindamaged way would be to actually use ONLY your UNIX braindamaged utilities. God, I think this paragraph has too many instances of "UNIX braindamage". xD
>There's no way to work with abstract syntax trees with any SQL server I know of.
You don't have to use the database software for it though.
Anyway, the issue wasn't really that you parse some braindamaged HTML for no good reason, which is pretty braindamaged still, but beside the point. The issue was that you'd have to change your semantic model from time to time, like with changing said page structure, or the page owner being a dick and requiring you to use JS as well, or something else. So, with browsing the web you would essentially have to develop your own browser if you think the current browsers are braindamaged. That was the point.
>Well, the thing about that is, I believe, something-something-direct-access-to-the-gpu, so no wonder it's fast.
m8, to make graphics, you have to go through a GPU one way or the other. framebuffer is the simplest possible thing, and even the VESA BIOS extensions give you enough to use it. what i mean by CPU rendering is writing directly to the framebuffer: you write 0x00ff0000 to framebuffer[500][500] and a red dot appears on your screen. GPU rendering on the other hand means either using some of the low level stuff on the GPU, or using the likes of OpenGL to manage a bunch of abstract objects. GPU rendering means you need a ton of software to use your GPU, and you have to have an advanced GPU, whereas even my voodoo3 has a framebuffer just as good as any (yeah the voodoo3 is bloated as well probably)
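The "write 0x00ff0000 to framebuffer[500][500]" model above is just indexing into flat memory. A sketch against a plain byte buffer; on Linux the identical writes would go through an mmap of `/dev/fb0` (fbdev) or a DRM dumb buffer instead of a `bytearray`:

```python
import struct

# CPU rendering in the post's sense: the framebuffer is just memory you
# write pixels into. Here it's a bytearray standing in for the mmap'd
# scanout buffer; the byte layout is the same either way.
WIDTH, HEIGHT, BPP = 1024, 768, 4        # 32-bit XRGB, 4 bytes per pixel

fb = bytearray(WIDTH * HEIGHT * BPP)

def put_pixel(x, y, xrgb):
    offset = (y * WIDTH + x) * BPP
    fb[offset:offset + BPP] = struct.pack("<I", xrgb)  # little-endian XRGB

put_pixel(500, 500, 0x00FF0000)          # the red dot from the post
```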
>Well, scaling is rather important, especially for video rendering when we have to align some video to the screen when the resolutions don't match.
Scaling isn't implemented very well in video players either. And you usually don't do a massive downscale like 1/4, since videos are pretty low-res: 4K video downscaled for a 1080p monitor is only 1/2 scale, which isn't bad. You can always simply use the GPU to scale an image because it's massively parallel, but I don't know how well that works. The one image viewer I found with GPU scaling (FastPictureViewer) has input lag.
>Other than that, resizing windows are probably better off cropping content than actually scaling it in realtime.
Well at least you can easily select another fixed size font or double a font in real time. I know current fonts do more advanced vectorized crap (and subpixel rendering) but I think that's all BS. I think anything that isn't scaling a full image will be even faster than my image viewer.
>Also another point is that we would actually load the CPU by doing this shit, which is not always desirable. Using GPU where appropriate is probably a good idea.
It's a bit of a problem, but you can probably get by using a small set of your CPU cores. I mean, if you just cache the scaled image, it only takes 3ms to redraw the screen when you pan/scroll. Only realtime scaling would cause high CPU load. I plan to explore the GPU later, but I wouldn't be surprised if there's no way to draw vsynced without input latency because of their crap API (yes, people will say you need gsync/freesync for this, but they're wrong, as demonstrated by my CPU rendering programs).
>It says my custom name of the machine, but anyway, you either assign "human-readable" names via some system or you assign some whatever names via some algorithm, what exactly is UNIX's fault here?
Don't worry about it.
>AFAIK, anything between is to be interpreted literally, meaning no funky business with dollar signs or whatever.
yes, "AFAIK". that's the whole problem. even if i read the spec or source, they will change the syntax next week. granted, in this case i don't think it's likely you can escape from '', but it's still not a good way to design a system, relying on shit like this
>So, you hypothetically propose to write a program that talks to a UNIX braindamaged program that talks to the actual place. This isn't really the UNIX braindamaged way, sorry. The UNIX braindamaged way would be to actually use ONLY your UNIX braindamaged utilities.
Well the webshit/UNIX braindamage fad lately is to make a command line program that talks to a webshit via JSON/"REST", and I doubt there's a way to grant limited access to your last.fm account to all your devices (so you need to make an authenticated proxy), so it's not far off.
>Anyway the issue wasn't really that you parse some braindamaged HTML for no good reason, which is pretty braindamaged still, but besides the point. The issue was that you'd have to change your semantic model from time to time, like with changing said page structure, or the page owner being a dick and requiring you to use JS as well, or something else. So, with browsing the web you would have to essentially develop your own browser if you think the current browsers are braindamaged. That was the point.
I was only submitting data to last.fm. Not parsing any web pages. They have a "REST" API. When I listen to A Love Supreme by John Coltrane, my program submits to last.fm/blah/logmusic?artist=john+coltrane&song=a+love+supreme (after the music player on my phone for example sends this info to my server)
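The query string in that URL is ordinary URL encoding, which the stdlib produces directly. A sketch using the post's made-up endpoint (`last.fm/blah/logmusic` is the post's placeholder, not the real last.fm API):

```python
from urllib.parse import urlencode

# Build the submission URL from the post with stdlib URL encoding;
# spaces become "+" and any special characters get percent-escaped.
params = {"artist": "john coltrane", "song": "a love supreme"}
url = "https://last.fm/blah/logmusic?" + urlencode(params)
```

The same `urlencode` call also handles titles with `&`, `=`, or quotes, which is exactly the class of input that breaks the shell-based pipeline discussed below.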
oh are you saying i would want to view the last.fm data as a separate thing from what i already have? yes that's a problem too. the only way is to make a dedicated parser for last.fm (or if they have an API for viewing - not submitting - you could use that, but it would still be braindamaged HTTP/JSON). as you say, web browser is brain damaged. and it's even worse than anything else, so the only solution is to parse the webpage and display it in your own GUI
>>5948 >m8, to make graphics, you have to go through a GPU in some way or the other
Yea, I realized somewhere during our conversation that I would mix up GPU rendering and stuff like DMA for GPU. Still, the actual GPU rendering could be faster and yes, it would require some special libraries to work at all.
>You can always simply use the GPU to scale an image because it's massively parallel, but I don't know how good that works.
Apparently it works so well that the mpv devs use the opengl renderer by default. There is also an xv renderer, which uses the XVideo extension and "may be low-quality, and has issues with OSD and subtitle display", but which utilizes "hardware acceleration" somehow. There is also the x11 renderer, which drops a fair number of frames when displaying a downscaled (it drops fewer at native) 1080p@60 video on my PC, and which consumes about twice as much CPU time as xv or opengl. x11 is allegedly "without hw acceleration".
So all in all, I think GPU is a good help, especially considering how dogshit CPUs may be. With videos there's been a possibility to decode certain codecs on the GPU itself as well, like DXVA, UVD or whatevers. It's even more energy-efficient, I think.
>I know current fonts do more advanced vectorized crap (and subpixel rendering) but I think that's all BS.
Subpixel rendering is BS, but the rest isn't.
Anyway, the thing with TTF fonts is their bitmaps usually get cached, so it's not that awful, and I think vector fonts are the future anyway. Like, we'll get 300DPI displays everywhere and we won't realistically be caring about hacking font rendering ever again. RN I'm using fonts with full hinting because otherwise they are too blurry on my 100DPI (usual for these) display.
>I plan to explore the GPU later but I wouldn't be surprised if there's no way to draw vsynced without input latency because of their crap API
Well, newer GPU have also Vulkan API, but I know nothing about it. Anyway I don't really know what problem you're referring to. Programs utilizing opengl api work just fine on my PC LOL.
>yes, "AFAIK". that's the whole problem.
I'm pretty sure single quotation marks are in the POSIX standard, which shells usually don't violate, unless they are explicitly non-POSIX.
And POSIX standards are not as easily rewritten as lastfm webpages.
>and I doubt there's a way to grant limited access to your last.fm account to all your devices (so you need to make an authenticated proxy), so it's not far off.
I don't get what you're trying to accomplish here but how is proxying a UNIX braindamage now? I mean, come on, let's just address the issues with the C language then LMAO.
>They have a "REST" API.
API is actually much better if they keep it stable. Like, there is no braindamage here unless you have a better idea for the protocol or whatever. Honestly, it doesn't even have to be through a webserver, necessarily, nor through the TCP.
>oh are you saying i would want to view the last.fm data as a separate thing from what i already have?
Yea, I was talking about that.
>so the only solution is to parse the webpage and display it in your own GUI
Yep, that's what I said: you would likely be reimplementing a good chunk of browser functionality. And then the site moves to HTTP/2 and you have to rewrite the whole protocol thing or link to libcurl or something. Like, whatever, I want to reimplement a browser too eventually, but that wouldn't be as easy.
>Subpixel rendering is BS,
vector graphics look bad, RIP your avatar. and fonts are so small that they have to be hand crafted to look good
>but the rest isn't.
what about when a chinese character is to be displayed and the program freezes for 10 seconds? whatever they're doing isn't working and isn't even a real problem
I don't see any reason for high resolution fonts. You should be making the fonts big to avoid straining your eyes for hours while reading, or better yet using text to speech with audio player features (pause/play/rewind/looping/marking). Reading has always been a problem even before computers.
>Well, newer GPU have also Vulkan API, but I know nothing about it. Anyway I don't really know what problem you're referring to.
You need vsync to render animations properly. But the standard ways to do it introduce input lag. So when you're scrolling it's annoying, because the output reacts to your movements slower than it should (this can be bypassed somewhat by increasing the refresh rate, as that reduces the effective input lag). I know the GPU should work well; the only thing I worry about is that it may introduce input lag.
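The lag being worried about here is easy to put numbers on. Assuming a classic double-buffered swap chain (input sampled at frame start, shown at the *next* vblank) versus the racing-the-beam approach described earlier (start drawing at vblank, finish in ~3ms):

```python
# Rough input-lag arithmetic for the vsync worry. The 3ms figure is the
# earlier poster's measured full-screen CPU render time.
refresh_hz = 60
frame_ms = 1000 / refresh_hz        # ~16.67 ms per refresh interval
double_buffered_lag_ms = frame_ms   # worst case: one full frame queued
race_the_beam_lag_ms = 3            # pixels hit the scanout almost live
```

This is also why raising the refresh rate helps: the queued-frame penalty shrinks proportionally (about 6.9 ms at 144 Hz).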
>I don't get what you're trying to accomplish here but how is proxying a UNIX braindamage now?
I mean the UNIX braindamage way to implement my last.fm updating would be to make a server (the proxy) that my phone connects to which it posts the names of songs im listening to. That server will call some program like lastfm_update, which is a command line tool, and string injection vulns will be introduced here.
-song is named "blah song'; rm -rf / #"
-phone listening to song sends "blah song'; rm -rf / #" to my server
-my server calls "lastfm_update --artist bleh --song 'blah song'; rm -rf / #'"
the moment you want to use a string in a UNIX system, you have to do all these workarounds, like escaping your ' or using some shell command to launch a process with an argument vector, which doesn't really work well either, and in both cases they're prone to all kinds of other problems. and the UNIX way to view a file is to output it to your terminal, which means metacharacters change all your shell state (and on bad shells lead to remote code execution)
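For what it's worth, the fix the thread keeps circling is exactly the "argument vector" route: pass the title as one argv element so no shell ever parses it, and if a remote shell is truly unavoidable, quote each word mechanically. A sketch (`lastfm_update` is the thread's hypothetical tool, not a real command):

```python
import shlex

title = "blah song'; rm -rf / #"   # the malicious title from the post

# Safe locally: an argv list. The title is a single element; nothing
# interprets the quote or the semicolon.
cmd = ["lastfm_update", "--artist", "bleh", "--song", title]

# If the command must cross a remote shell (e.g. via ssh), quote each
# word so the shell reassembles exactly the same argv on the other side.
line = " ".join(shlex.quote(w) for w in cmd)
```

`shlex.quote` guarantees a POSIX shell will split `line` back into the original words, which is checkable mechanically rather than by guessing the shell's grammar.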
>Like, there is no braindamage here unless you have a better idea for the protocol or whatever.
Yes I have better ideas like prefix tree serialization of typed data instead of plaintext format du-jour (which is JSON today), and no HTTP braindamage.
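For what it's worth, here is one illustrative reading of typed, length-prefixed serialization (my own sketch, not necessarily what's meant by "prefix tree"): every value carries a type tag and a byte-length prefix, so a reader never scans for delimiters and strings can contain anything, including shell metacharacters:

```python
import struct

# toy binary format: 1 tag byte, 4-byte big-endian length, then payload
TAG_INT, TAG_STR, TAG_LIST = 0x01, 0x02, 0x03

def encode(value):
    if isinstance(value, int):
        payload, tag = struct.pack(">q", value), TAG_INT
    elif isinstance(value, str):
        payload, tag = value.encode("utf-8"), TAG_STR
    elif isinstance(value, list):
        payload, tag = b"".join(encode(v) for v in value), TAG_LIST
    else:
        raise TypeError(value)
    return bytes([tag]) + struct.pack(">I", len(payload)) + payload

def decode(buf, pos=0):
    tag = buf[pos]
    (length,) = struct.unpack_from(">I", buf, pos + 1)
    start, end = pos + 5, pos + 5 + length
    if tag == TAG_INT:
        return struct.unpack(">q", buf[start:end])[0], end
    if tag == TAG_STR:
        return buf[start:end].decode("utf-8"), end
    if tag == TAG_LIST:
        items, p = [], start
        while p < end:
            item, p = decode(buf, p)
            items.append(item)
        return items, end
    raise ValueError(tag)

# a song title full of shell metacharacters round-trips untouched
value = ["blah song'; rm -rf / #", 42]
print(decode(encode(value))[0])
```

No escaping, no quoting, no parsing ambiguity: the reader always knows the type and exactly how many bytes to take.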
>Yep, that's what I said, you would likely be reimplementing a good chunk of browser functionality. And then the site moves to HTTP/2 and you have to rewrite the whole protocol thing or link to libcurl or something. Like, whatever, I want to reimplement a browser too eventually, but that wouldn't be as easy.
Well, the final solution is making dedicated programs for each website or class of websites. This is how it will have to be until the web finally dies. I don't even bother parsing HTTP, I just seek to the part of the output I'm looking for. Not really sure what the best strategy is, but I don't see any point in pretending HTTP is a real protocol. E.g. a website may still return the data you want and give a 404 error at the same time. Some library that raises an exception on 404 isn't much help. I guess in some cases the way to distinguish results from the web page would be to parse the header, but even then I'm not sure it's worth picking up or making an HTTP library for that instead of just adding 3 units to the top of your parsing machine.
>>5953 >vector graphics look bad, RIP your avatar. and fonts are so small that they have to be hand crafted to look good
What?
>what about when a chinese character is to be displayed and the program freezes for 10 seconds?
Literally never happened to me.
> I don't see any reason for high resolution fonts. You should be making the fonts big to avoid straining your eyes for hours while reading
Even 12pt font is rather small, though that's not really the point. The point is that a hidpi display will show fonts of the same physical size in higher resolution, which WILL be easier on the eyes, and you will say goodbye to all grayscale antialiasing issues. Heck, you could probably forget antialiasing altogether, that's how high the DPI may be. And there is nothing special about "high resolution" TTF fonts, it's just vector fonts rendered to a bigger bitmap.
>Reading has always been a problem even before computers.
The point with computers is that the DPI of regular monitors is at most half of the "paper DPI" (contemporary printers usually give at least 200-300 DPI).
>I mean the UNIX braindamage way to implement my last.fm updating would be
…to have the whole solution as a script, working like this: your app is a script too, and it calls lastfm_update via the shell. That's all. What the heck do you need the proxy for?
Anyway, another legit form of IPC in UNIX is sockets or heck, even pipes might work.
>the moment you want to use a string in a UNIX system...
The moment you launch a shell in a UNIX system, yea.
>and the UNIX way to view a file is to output it to your terminal, which means metacharacters change all your shell state
Usually people do it via lesspipe, which is a funny violation of UNIX braindamage. Well, not really a violation per se. Cat is not really for displaying files on your terminal though. Cat is for concatenation of files.
>Yes I have better ideas like prefix tree serialization of typed data
Could you give an example, 'cause I don't really follow.
>Well, the final solution is making dedicated programs for each website or class of websites.
This is fucking dogshit. I mean, it will work, but it's not worth the effort.
>I don't even bother parsing HTTP, I just seek to the part of the output I'm looking for.
I'm not sure, but unless you can skip cookie identification, this isn't gonna work. Same with content-length, probably. Anyway I'm not much of a buff in HTTP. How do you "seek" to the data exactly? \r\n\r\n? Do you not want the length header? Datatype header? Whatever, I guess. Maybe a minimal HTTP grabber is easy to do, but we haven't even started talking about HTML parsing.
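For reference, the "seek" presumably means splitting on the blank line that ends the headers. A sketch of the 404-with-data case described above, using hypothetical response bytes:

```python
# a raw response whose status line says 404 but whose body still
# carries the data the site returned anyway
raw = (b"HTTP/1.1 404 Not Found\r\n"
       b"Content-Type: text/html\r\n"
       b"\r\n"
       b"<html>the data you wanted</html>")

header, _, body = raw.partition(b"\r\n\r\n")   # headers end at the blank line
status_code = int(header.split(b"\r\n")[0].split()[1])
print(status_code)  # 404, yet the body is still perfectly usable
```

This is the whole "3 units on top of your parsing machine" case; it falls apart once you need chunked transfer encoding, compression, or cookies, which is the objection above.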
>>5906 >The way to do GUIs right now is to write directly to a framebuffer.
you are fucking idiot
if you don't use system native library but framebuffer, your application won't look and feel native, it will be unique nigger shit, with zero consistency
each unix shit app is looking different
only OS native calls should be used to construct and handle a GUI. you can use a third party library if it then calls OS library
>>5913 >remote connection, eg SSH.
why doesn't SSH nigger work with GUI? it could send the screen of the machine you are connecting to
>>5958 >why not use GUI over SSH?
I don't want to install a whole graphical environment on a server. If it's Linux, all administration is done via the command line anyway. Plus, most GUIs work poorly over a high-latency, low-bandwidth, intermittent connection such as mine. But I'm sure some of them work well, so long as they aren't doing something retarded like writing directly to a framebuffer.
>>5954 Antialiasing makes shit blurry, and requires either a GPU or a ton of CPU time. I don't know all the tradeoffs here, but I don't see any problem with just drawing glyphs pixel by pixel. If your screen is some insane resolution, you just double/triple/etc each pixel in both dimensions, and it will look more or less like on a normal screen. I don't see how vector graphics can work for a very small font size. They will destroy the artist's intent. Like, some line will look stupid because the machine just puts part of it in the 2-4 available pixels, whereas an artist can choose the most optimal-looking placement. And even at bigger resolutions vector graphics are still a machine extrapolation of what an artist made. I don't really want to search for an example. Take it or leave it.
>What the heck do you need the proxy for?
The phone and other untrusted devices connect to the trusted machine to submit to last.fm. That trusted machine is the proxy. This way the untrusted machines can only log plays of music, instead of messing with the account. This is a very simple thing to achieve outside of UNIX, but the moment you try to have any proxy like this in UNIX, it's a nightmare.
>>Well, the final solution is making dedicated programs for each website or class of websites.
>This is fucking dogshit. I mean, it will work, but it's not worth the effort.
Well I'm going to start a community doing it, so it won't all be one guy doing scripts for every website in the world. Also it will be possible to securely use other people's untrusted parsers (in a proper way, not some sandbox du jour). The alternative is to implement a browser, which is utter dogshit, and has to be slow and insecure as a requirement in order to be able to work on any web page.
>>5958 >if you dont use the official libs your shit wont be in the official style
that's all well and good but the official libs are fucking AIDS. GTK not even once
>it will be unique nigger shit, with zero consistency
yes, unique but also the only program that actually works
>only OS native calls should be used to construct and handle a GUI. you can use a third party library if it then calls OS library
there should be primitives to make GUIs that are actually useful and composable (unlike Windows or GTK or any Linux shit or DOM). people need to go back to using the raw framebuffer to discover these
>>5968 >I don't know all the tradeoffs here, but I don't see any problem with just drawing glyphs pixel by pixel.
I dunno what you mean, but vector fonts (TTF, OTF, etc) do not have "pixels". They start having "pixels" once rendered from the vector form to the bitmap form by the rendering engine. And anyway, yea, once they are in a bitmap form, they are drawn "pixel by pixel" or whatever, it's pretty much just bitmaps waiting to be pushed into a framebuffer.
>If your screen is some insane resolution, you just double/triple/etc each pixel in both dimensions, and it will look more or less like on a normal screen.
That approach will just give you huge squares instead of pixels. There is no point in using the large resolution at all if you're going to do this.
>Like some line will look stupid because it just puts part of it in the 2-4 available pixels
No, it won't be like that. It's what happens NOW with lower font sizes, actually. With a huge resolution, a glyph would be rendered not into like 5 pixels, but into like 100. See? Imagine that your pixels are going to have a smaller physical size, with the larger glyphs ending up the same physical size on the hidpi screen. Of course, that would mean bitmap fonts would have to be rewritten or removed, and all UIs somehow hardcoding small (or any, actually, it's a bad practice as far as typography is concerned) font sizes in pixels (like, 10px in CSS rules) have to be ditched, but the gain for dem eyes is just insane. And of course that would be harder on the hardware, like, obligatory 4k monitors everywhere, but it's not like we're in the 90s really.
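The pixel counts are easy to check: a glyph's pixel height is its point size over 72 (one point is 1/72 inch) times the DPI, so the same physical 12pt glyph gets far more pixels on a hidpi panel:

```python
def glyph_px(point_size, dpi):
    # 1 point = 1/72 inch, so pixel height = points / 72 * dpi
    return point_size / 72 * dpi

print(glyph_px(12, 96))   # 16.0 px tall on a regular monitor
print(glyph_px(12, 300))  # 50.0 px at paper-like DPI, ~10x the pixel area
```

That's the jump from "like 5 pixels" to "like 100" per stroke: roughly 3x the linear resolution, so about 10x the pixels per glyph.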
>And even at bigger resolutions vector graphics are still are a machine extrapolation of what an artist made.
At 300 DPI they are going to be virtually indistinguishable. Or maybe ACTUALLY indistinguishable, look up the maximum optical resolution of human eye or something. Like, I don't see you complaining about serif glyphs on paper LOL.
>This way the untrusted machines can only log plays of music, instead of messing with the account.
I am not sure about this trust model. You rely on your "trusted machine" to "mess with the account" instead of the code on your "untrusted devices". Like, unless you're going to cut the "untrusted devices" from the Internet completely, this doesn't make sense.
And anyway, I have had enough of your bullshit about UNIX strings or whatever. The proxy server in question doesn't have to launch shell at all, it doesn't have to access the filesystem, it doesn't have to do SHIT of what you're so afraid of.
>Well I'm going to start a community doing it, so it wont all be one guy doing scripts for every website in the world.
Good luck with that, I guess.
>Also it will be possible to securely use other people's untrusted parsers (in a proper way, not some sandbox du jour).
It really wouldn't be much about trust (after all, people run untrusted shit all the fucking time), it would be first and foremost about the completeness of your solution, so…
>The alternative is to implent a browser, which is utter dogshit, and has to be slow and insecure as a requirement in order to be able to work on any web page.
…the alternative browser which implements only non-cancerous Web parts might be just right IMO. Also if it gets enough, I dunno, public attention, one could try to implement saner protocols/formats and actually make a difference. Like, for example, the only real improvement over raw HTML that JS provides in my opinion would be that auto-filling of input forms by clicking some page objects. Honestly, that's it. Some async Ajax or whatever bullshit isn't necessary if you don't do some cancerous shit like browser games or, I dunno, web chat or something.
>and has to be slow and insecure as a requirement in order to be able to work on any web page.
Browsers are not slow, [some] webpages are. NoJS pages would load fast as shit.
>that's all good and all but the official libs are fucking AIDS. GTK not even once
I think the good point would be that we better implement an actual toolkit then, like, with primitives and elements and shit.
Like, I've been thinking actually about a toolkit with network transparency built in. Like, it would send some rendering commands over the network, like draw checkboxes there and dropdown menus here, and the local machine would actually render that shit. It sounds kinda doable. The main point I wanted to make is that we have to avoid sending bitmaps over the network. Like, preferably, that shit has to be banned completely.
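A back-of-the-envelope sketch of that wire format (the ops and fields here are made up for illustration): the remote app ships widget descriptions, never bitmaps, and the local machine renders them with its own toolkit. The size difference is the whole argument:

```python
import json

# hypothetical draw commands for one small window
frame = [
    {"op": "window", "title": "Settings", "w": 320, "h": 200},
    {"op": "checkbox", "id": "autosave", "label": "Autosave", "checked": True},
    {"op": "dropdown", "id": "theme", "items": ["light", "dark"], "selected": 1},
]

wire = json.dumps(frame).encode()   # what would actually cross the network
bitmap_bytes = 320 * 200 * 4        # the same window as a raw RGBA bitmap
print(len(wire), "bytes vs", bitmap_bytes, "bytes")
```

A couple hundred bytes of commands versus a quarter megabyte of pixels per update, which is why the approach survives high-latency, low-bandwidth links where VNC-style bitmap pushing dies.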
>yes, unique but also the only program that actually works
Nah, the GUI standards set by Xerox are pretty much universal TBH. The main difference comes mostly from theming and from retarded decisions by toolkit devs. Like, honestly, did you know that Shift-Insert is hardcoded to paste from X Clipboard selection in GTK, like, what the actual FUCK? I'm not clicking the middle mouse button all the damn time!
>people need to go back to using the raw framebuffer to discover these
People don't need to do shit, really. Toolkit just needs to be there and needs to be simple to make something useful with.
>>I don't know all the tradeoffs here, but I don't see any problem with just drawing glyphs pixel by pixel.
>I dunno what you mean,
I mean each glyph is drawn pixel by pixel by hand
>That approach will just give you huge squares instead of pixels.
At 1024x768 on a 20" screen with simple fonts I don't see any squares, at least not such that they make it harder to read. But yeah, I dunno, maybe I'm just used to low-res fonts.
>There is no point at using the large resolution at all if you're going to do this.
There is a point. For images.
>Of course, that would mean bitmap fonts would have to be rewritten or removed
no you could just double them. something made for 1024x768 15" will look practically the same on 2048x1536 15" when doubled in both dimensions
and none of this requires the UI to hardcode font sizes
CSS problems are fabricated
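The integer doubling described above is trivial; here's the whole trick as a sketch. Each pixel becomes a 2x2 block, so shapes are preserved exactly:

```python
def double(bitmap):
    # nearest-neighbour 2x upscale: every pixel becomes a 2x2 block
    out = []
    for row in bitmap:
        wide = [px for px in row for _ in range(2)]  # double horizontally
        out.append(wide)
        out.append(list(wide))                       # and vertically
    return out

glyph = [[1, 0],
         [0, 1]]
print(double(glyph))  # [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
```

No resampling, no blur: a bitmap font made for 1024x768 renders identically, just at the doubled resolution.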
>I am not sure about this trust model. You rely on your "trusted machine" to "mess with the account" instead of the code on your "untrusted devices". Like, unless you're going to cut the "untrusted devices" from the Internet completely, this doesn't make sense.
How does it not make sense? The phone is a piece of shit that gets exploited every 5 minutes. So after 5 minutes it will take over my account and fuck up all the settings. That's why it can't have unnecessary permissions. It doesn't have the password for my account. It has to go through my server which only lets it do a subset of possible actions.
>The proxy server in question doesn't have to launch shell at all, it doesn't have to access the filesystem, it doesn't have to do SHIT of what you're so afraid of.
Lol it's just an example of how UNIXtards would implement this shit. They _would_ fuck around with strings and shells. They always find a way, it's their fetish.
>Like, I've been thinking actually about a toolkit with a network transparency built in.
network transparency is bloat. why do you need that? my only goal is that my TCB doesn't get hacked. which means fundamental stuff like the GUI has to be kept as slim as possible
>It really wouldn't be much about trust (after all, people run untrusted shit all the fucking time), it would be first and foremost about the completeness of your solution, so…
It absolutely is about trust. I want to be able to browse thousands of data sources, and I don't have time to audit someone's code or even figure out whether I trust him.
BTW this is why web browser is cancer. Useful websites are simply data sources, or articles (which are still basically a single piece of data). The GUIs they deliver (even if HTML-only even without CSS) are invalid. They should simply output data. So we need code to convert their bullshit into usable data. Websites shouldn't make me interact with their data through their server or JS UI. They give me the data, and I manipulate it with my local data manipulation tools. A good example is dictd. The typical normalfag types in words in a search engine or dictionary website search box. Both of these are slow as fuck (up to seconds of page loading, and extra slowness if you actually have opsec and use onion routing), leak data about you, and give the website an opportunity to give you false information back. Also the data could change at any time without them telling you. Meanwhile, all I do is type "d apotheosis" in my terminal, and get pages of dictionary content immediately. No server is contacted and the result will always be the same when I look it up.
>Browsers are not slow, [some] webpages are. NoJS pages would load fast as shit.
Even plain HTML pages are slow for me, even on my most modern machines. You most likely have a different definition of slow. I also consider stutters as slowness. When that shit is stuttering and I'm typing and it's not instantly responding to my commands, my eyes blank out. And this happens 40 times per second.
>Nah, the GUI standards set by Xerox are pretty much universal TBH.
ive started using wasd and arrows for arrow keys because my right hand hurts, and assigning every key to whatever i feel like. at this point i dont even give a fuck anymore. i dont try to make anything compatible with the status quo. i just assume its retarded and ignore it. if there's a good idea, i take it
>>5980 >I mean each glyph is drawn pixel by pixel by hand
Bitmap fonts are legit, but vector fonts are more flexible.
>with simple fonts I don't see any squares insofar as they make it harder to read.
Good bitmap fonts are designed to not look dogshit, so no wonder. TBH we might just need more of them bitmap fonts. It looks like they're stuck in the 90s as far as adoption goes.
>There is a point. For images.
Yea, of course.
>no you could just double them
Vector fonts could go to any scale though, not just 2x. That's typographically important.
>and none of this requires the UI to hardcode font sizes
Well, DUH. I was saying that if some UI hardcoded font sizes, we'd be fucked, both our models.
>How does it not make sense?
As I said, if you don't cut your untrusted device's access to the Internet, you'd better start fixing its susceptibility to being h@xed.
Although sure, if you don't give it the password for your account, it cannot go directly to your account. You'd have to make your proxy accessible to the Internet though, which is an infrastructure effort, strictly speaking, and for what? For the phone to leak some other bullshit? I think you need to get your priorities straight.
>Lol it's just an example of how UNIXtards would implement this shit. They _would_ fuck around with strings and shells. They always find a way, it's their fetish.
Come on now. Do your DNS server or web server pull tricks like those?
>why do you need that?
I don't want to give a shit where I run my software, really, and as stated above, shell has some usability deficiencies. It's good for scripting and for stuff you know how works, but if a well-designed GUI could be used instead with little to no overhead, I would take it, 'cause why not? You didn't type your post in the terminal, I bet.
>It absolutely is about trust. I want to be able to browse thousands of data sources, and I don't have time to audit someone's code or even figure out whether I trust him.
As long as you run to commercial search engines for your search results, you will find yourself lacking "backend parsers" more often than you'd care to admit. That's what I'm talking about.
With a simple browser you would be able to get a quick read of something pretty much all the time. Despite all the bullshit trends, sites usually return something meaningful without JS.
>The GUIs they deliver (even if HTML-only even without CSS) are invalid.
Well, my opinion on the whole thing altogether is that webpages ARE NOT GUI AT ALL. It's a collection of documents to READ. If you have a better idea of GUIing that shit, you will be doing EVEN MORE work on your backends AND frontends. I can already see you spending more time on maintaining the bullshit ad-hoc infrastructure than doing some work, and I don't like it one bit.
>A good example is dictd.
I use sdcv.
Dictionaries just favor the offline model altogether, so that's not a good counter-example. The Web is for documents you don't have and probably don't even know exist.
>When that shit is stuttering and I'm typing and it's not instantly responding to my commands, my eyes blank out.
Nothing's stuttering on the static webpage once it's loaded, like, c'mon. Do you want some quick POST replies or what the fuck are you even talking about?
Anyway, it seems like your problems won't be over if we ditch HTTP, so it's not really useful.
>ive started using wasd and arrows for arrow keys because my right hand hurts, and assigning every key to whatever i feel like. at this point i dont even give a fuck anymore. i dont try to make anything compatible with the status quo.
It's not about keybindings. It's about windows, input forms, buttons, checkboxes, radiobuttons, file dialogs, icons, menus and dropdown menus (maybe even pie menus), tabs - all that shit that you're probably using in one way or another, and it's unlikely you are revolutionizing anything in this regard.
>Come on now. Do your DNS server or web server pull tricks like those?
Yes actually
>You didn't type your post in the terminal, I bet.
of course not, I'm anti-terminal
the final solution is a console based around a graphical programming language, which may or may not obsolete tailor-made GUIs.
>With a simple browser you would be able to get a quick read of something pretty much all the time. Despite all the bullshit trends, sites usually return something meaningful without JS.
I still need aggregated data. Like the ability to instantly search 1000 good warez sites. Also even just having consistent (quality) UI for every single website without shared session bullshit or wait times to fetch the GUI from the website upon each access (yes the web has caches and shit, but it's still fucking retarded), is a huge gain.
>Well, my opinion on the whole thing altogether is that webpages ARE NOT GUI AT ALL. It's a collection of documents to READ.
agreed
>If you have a better idea of GUIing that shit, you will be doing EVEN MORE work on your backends AND frontends.
A GUI I would make for example, is something to search for warez across 1000 warez sites. Another example is better GUIs for chans. Nanochan is the best chan but it's still shit since it's limited by what static HTML can do. OTOH all the other chans have some more interactive functionality via JS but they are slow as fuck and lack features and can't do basic caching and round trip reduction. Hell they can't even implement the captcha entry form correctly.
>and it's unlikely you are revolutionizing anything in this regard.
correct. my revolutions focus around purging UNIX braindamage and webshit and crapitalist software. everything has to be remade anyway.
>>5987 before someone misinterprets my shit. what i mean by anti-terminal is, there should be a _REAL_ programming language, and _THAT_ should be the shell. also an aside is that i believe textual languages are inefficient and we only use them because we developed 1 dimensional math notation because its easier to write with a pen. but for computers that restriction no longer exists (unless you care about writing source code on paper, which i dont)
at the very least, parentheses should be replaced with boxes (yeah it sounds stupid, but its not), and you should be able to blindly paste any data into your shell and make use of it without having to worry about pasting code or hidden bullshit like you currently do with any UNIX terminal.
>>5987 >Yes actually
Backstory?
> the final solution is a console based around a graphical programming language
And look at this person talking shit about "bloat", te~he~.
>I still need aggregated data.
So, you need to add a scraper/crawler to your mix. Great fucking day. I mean, it's not a bad idea, I just see even more work.
And once you fetch all the shit, you'll make up some reader for this, even though browsers can read offline pages just fine.
>all the other chans have some more interactive functionality via JS but they are slow as fuck and lack features and can't do basic caching and round trip reduction
What are you talking about? Ever tried the now-ancient Wakaba imageboard engine? TBH I think it's better than whatever buggy Nanochan software is partially because it was written by people who understand the Web somewhat. It has very minimal JS support, but works fine without it.
Now, I dunno what you mean by "slow" but generally those static-oriented light engines render HTML fast as shit. Web cache functionality is delivered by webserver software. You cache your static pages and it's fine.
>my revolutions focus around purging UNIX braindamage and webshit and crapitalist software.
Well, we were talking about GUI, so that's kinda pointless.
>>5990 >unless you care about writing source code on paper, which i dont
There is not much point in writing code on paper, but algorithms, building blocks etc of the project make a lot of sense to be written on paper.
If anything, the ULTIMATE programming language would be able to translate a flowchart into a working program. Well, the flowchart has to be well-formed in some way, but I guess you see the point. The visual programming, yaddayadda.
>parentheses should be replaced with boxes
Unless boxes mean something different, it doesn't matter, it's just a syntax flavor. We could insert some visual crutch to replace parens with boxes.
>you should be able to blindly paste any data into your shell and make use of it without having to worry about pasting code or hidden bullshit like you currently do with any UNIX terminal.
TBH I think it's more bitching than the actual issue. Shell is designed in a way that spaces separate arguments. You will have to design some other shit if you don't like spaces for delimiters, and you still might run into similar problems. Like, into even more problems provided your syntax would probably be more complicated.
>>5990 >a real programming language should be the shell
A programming language that is good for writing complex, efficient, low-level systems is not a good language for doing one off interactions in. There are a multitude of problems with existing shells, but their replacement should always be a better shell, not some language that wasn't designed for the task
>1 dimensional math notation
wot?
>because it's easier to write in pen
wot?
>for computers that restriction no longer exists
The reason we use text to program is because plain text does not hide anything. Programs are chaotic: a small change in one part changes the entire program completely. To hide any one part is to make the entire program incomprehensible.
>worry about pasting code in any UNIX terminal
For some contrived reason I was editing text in terminator. I copied some commands off the internet, and tried to paste them into vim. Vim entered them as text! I tried again, this time careful to make sure I wasn't in insert mode, but it happened again. I realized that terminator and vim were conspiring to make sure that I could not paste commands literally, using some in-band signalling. I switched to st and the problem went away. I have also used terminals (and shells) that will warn you before pasting, or remove any line breaks to prevent you from executing commands. I personally avoid these, because they solve a problem that is wholly fictional. The reason the problem persists is because there are too many people like me.
I agree with the avatarfag, vector fonts are much better. Vectors in general for graphics, wherever possible, are a much better solution than pixels.
The exception is, of course, high resolution images.
>>5953 >I don't see any reason for high resolution fonts.
It does make sense. Read about legibility and readability studies and you'll see people are affected by how fonts are rendered. Offtopic: I've found the SAM method (Suitability Assessment of Materials) quite useful while writing stuff.
>Reading has always been a problem even before computers.
Actually reading symbols is a much more efficient way to interpret a language than through phonetics. With symbols you can 'suppress' subvocalization and get higher speed of interpretation (using RSVP, or "rapid serial visual presentation").
>>5974 >maximum optical resolution of human eye
That's complicated. See:
https://en.wikipedia.org/wiki/Hyperacuity_(scientific_term)
>>5980 >And this happens 40 times per second.
Actually it's normally 60Hz. Displays without PWM and with higher refresh rates are meant to mitigate this. You can't solve it by software only, unfortunately.
https://www.eizo.com/library/basics/eyestrain/
>>5990 >there should be a _REAL_ programming language, and _THAT_ should be the shell.
That's basically javascript, a JIT (just-in-time compiler) on the terminal. And that's a very bad idea for many reasons. Domain-specific languages like the shell solve the problem much more efficiently, securely and with less complexity.
>>6004 >A programming language that is good for writing complex, efficient, low-level systems is not a good language for doing one off interactions in. There are a multitude of problems with existing shells, but their replacement should always be a better shell, not some language that wasn't designed for the task
Indeed, I'm talking about a high level language with a real console as part of the design goal (as opposed to shitty REPL like what everything now has). The main goal of modern PL design is composability, which is exactly what's needed for a console. Editing code by moving around syntactic elements is probably much more user-efficient than editing text. What domain do you have in mind? Parsing? That's what bash does and it sucks. In fact I never want to do parsing at all. I want strongly typed data structures and easy ways to manipulate them and pass them between OS processes. Look at this /proc/bus/input/devices bullshit in Linux. I'm parsing this to make my image viewer automatically find all the keyboards/mice. Why can't I just fucking do:
[aa]
devices = getDevices()
map(devices, (x ->
    if (Set.member(x.ev, EV_KEY)) {
        List.append(keyboards, x.inputDeviceId)
    }
    if (Set.member(x.ev, EV_MOUSE)) {
        List.append(mice, x.inputDeviceId)
    }
))[/aa]
(note this is written in common tongue, instead of a good language)
Instead I have to parse that shit, convert the EV field to a number, and use bit operations to extract EV_KEY.
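For the curious, the parsing being complained about looks roughly like this in Python. EV_KEY/EV_REL/EV_REP are the real Linux input event type numbers; the sample records below are abridged /proc/bus/input/devices output:

```python
# EV bitmask event types from the Linux input API
EV_KEY, EV_REL, EV_REP = 0x01, 0x02, 0x14

sample = """\
I: Bus=0011 Vendor=0001 Product=0001 Version=ab41
N: Name="AT Translated Set 2 keyboard"
B: EV=120013

I: Bus=0003 Vendor=046d Product=c077 Version=0111
N: Name="Logitech USB Optical Mouse"
B: EV=17
"""

def devices(text):
    # records in /proc/bus/input/devices are separated by blank lines
    for record in text.strip().split("\n\n"):
        name, ev = None, 0
        for line in record.splitlines():
            if line.startswith('N: Name='):
                name = line.split('=', 1)[1].strip('"')
            elif line.startswith('B: EV='):
                ev = int(line.split('=', 1)[1], 16)  # the hex bitmask
        yield name, ev

# mice also set EV_KEY for their buttons, so keyboards are detected by
# EV_KEY plus autorepeat (EV_REP), mice by EV_REL
keyboards = [n for n, ev in devices(sample) if ev >> EV_KEY & 1 and ev >> EV_REP & 1]
mice = [n for n, ev in devices(sample) if ev >> EV_REL & 1]
print(keyboards, mice)
```

Text splitting, hex conversion, bit shifting: exactly the "parse that shit, convert the EV field to a number, and use bit operations" busywork that typed structures would make unnecessary.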
>>1 dimensional math notation
>wot?
>>because it's easier to write in pen
>wot?
all textual syntax is 1-dimensional, e.g.
a(b(c(d(e(f)),g(h),i,j(k)),i(j),k(l(m)),n(o(p))))
the only way to read it is to parse it in your head. you cannot know what k(l(m)) is applied to without reading the entire expression to the left, or doing some other inferences. you can't simply look at this and instantly know that k(l(m)) is the 3rd argument to b, whereas in pic related, you _can_
my hypothesis is that it's easier to program (and especially read code) with graphical syntax, or maybe the optimal is somewhere between graphical and textual. but the optimal is definitely not pure text, nor a plain text editor
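The claim is easy to demo: parse that expression into a tree and print it in two dimensions, and k(l(m))'s position under b becomes visible at a glance instead of requiring a left-scan. A rough sketch:

```python
def parse(s, i=0):
    # recursive descent for name(arg, ..., arg); returns ((name, args), next_i)
    j = i
    while j < len(s) and s[j].isalnum():
        j += 1
    name, args = s[i:j], []
    if j < len(s) and s[j] == "(":
        j += 1
        while s[j] != ")":
            node, j = parse(s, j)
            args.append(node)
            if s[j] == ",":
                j += 1
        j += 1
    return (name, args), j

def render(node, depth=0):
    # indentation IS the structure: depth on screen = depth in the tree
    name, args = node
    lines = ["  " * depth + name]
    for a in args:
        lines += render(a, depth + 1)
    return lines

expr = "a(b(c(d(e(f)),g(h),i,j(k)),i(j),k(l(m)),n(o(p))))"
node, _ = parse(expr)
print("\n".join(render(node)))
print(node[1][0][1][2][0])  # "k": k(l(m)) sits at b's 3rd slot, no scanning
```

This is of course still text, but the indented rendering is exactly the 2D view being argued for.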
textual syntax means people can freely format their shit and makes auditing hard
shitty code bases typically have 300 character lines or some shit that goes way off the right side of your text editor. you eventually get used to that, and then line 651 has some obscured form of exec("rm -rf ~") all the way on the right
also no matter what you do with text syntax, there's no easy way to know that something halfway down the page is in the if statement at the top, e.g.:
[aa]
if (y) {
    if (x) {
        for (..) {}
        if (z) {
            g{ }
        }
        ... more crap
        f(a);
        if (s) {
        }
        ...more crap
    }
}[/aa]
how can you tell f(a) is inside the if(x)?
only by counting braces, or by trusting the person who wrote the code to indent properly (or trusting your tools to auto-format it, which doesn't even work if macros are present) and having the editor show some indent dots to follow down from the if(x)
indented languages like haskell, python, or coffeescript also sort of force correct indentation but i wouldn't rely on them
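The "counting braces" bookkeeping is literally this, which is what your eyes end up doing by hand in flat text:

```python
def depth_at(code, index):
    # nesting depth at a character position, found the only way flat
    # text allows: by counting the braces that precede it
    depth = 0
    for ch in code[:index]:
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
    return depth

src = "if (y) { if (x) { f(a); } }"
print(depth_at(src, src.index("f(a)")))  # 2: f(a) sits inside both ifs
```

A structural editor keeps this count as part of the representation, which is the whole argument against plain text plus trust-the-indentation.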
>The reason we use text to program is because plain text does not hide anything. Programs are chaotic: a small change in one part changes the entire program completely. To hide any one part is to make the entire program incomprehensible.
Non-textual syntax doesn't hide anything either. At least not the ways I've implemented it (which are different than pic related). In fact, textual languages are full of ambiguities and hide stuff much better.
>I personally avoid these, because they solve a problem that is wholly fictional.
Lol actually they probably don't solve the problem (which isn't remotely fictional). There is no way to securely paste text into a terminal and there never will be. The very notion of "securely pasting text" is absurd and is not even a thing outside of UNIX, where pasting just works and has no problems. Security aside, I still want to be able to paste some text into a string expression in the console without having to look it over and see if it needs to be escaped, and you can't even do that in any REPL ever made.
Here is just one of the problems:
https://thejh.net/misc/website-terminal-copy-paste
I don't see how this is a topic that merits discussion further than "it's broken". Text boxes (yeah yall are probably imagining Windows text boxes, but it doesn't have to be like that) absolutely solve this trivial problem, whereas any UNIX mitigation leads to open questions such as "is it secure yet?".
>>6014 >>Reading has always been a problem even before computers.
>Actually reading symbols is a much more efficient way to interpret a language than through phonetics. With symbols you can 'suppress' subvocalization and get higher speed of interpretation (using RSVP, or "rapid serial visual presentation").
I meant that reading is a problem for your eyes, because it hurts. And so you should probably avoid reading tiny text or symbols. Even examining any kind of visuals over and over is bad and will cause eye strain, and I'd assume wear and tear of your eyes and surrounding areas of the face. I'm talking people who read or look at any kind of information on the screen 4+ hours a day. But the least we can do is use bigger fonts.
>>And this happens 40 times per second.
>Actually it's normally 60Hz.
I wasn't talking about the refresh rate. I'm saying, when I'm interacting with a computer, and it doesn't respond immediately, I get pissed off and roll my eyes or lose consciousness for a tiny period each time, and this happens a ton of times each second. Anyway I was exaggerating.
>Displays without PWM and with higher refresh rates are meant to mitigate this. You can't solve it by software only, unfortunately.
>https://www.eizo.com/library/basics/eyestrain/
based Eizo shill
Does PWM make _your_ eyes hurt? Of course LCD vendors are shilling non-PWM monitors after selling them for 10 years. That's how they make money. Next they will say we need strobed backlights. Then in 20 years we will need 2000Hz non-strobed backlights. Then a different tech than LCD, because even IPS still has viewing angle issues. They also have "low blue light" as a similar feature which may or may not do anything and costs them nothing to do. Personally I think motion blur causes much more eye strain than some placebo-ass bullshit like non-visible flickering. And the solution to motion blur is to use timed "flickering" (backlight strobing) or a CRT. Or you could get a 2000Hz monitor, that would probably work too if you can support it.
The motion blur problem is not even debatable really. When an object on screen is moving, there are only a few things you can do:
>Be like a normal human and focus on the object. Your eyes will continuously fail to focus on it because it's not possible to focus on it.
>Look at a single point in the screen instead of following the object
>Train yourself to not focus on any moving object, which means you spend Xms looking at nothing, Yms where the blur is actually present, and Zms looking at nothing. so you're always spending X+Z ms longer than necessary staring into nothing, which adds up to around a half a second to a second each time unless you're really trying. where "looking at nothing" means following the object but not focusing on it, or looking somewhere else on the screen
On the other hand, PWM is just "oh its bad for your eyes because reasons"
>That's basically javascript, a JIT (just in time compiler) on the terminal. And that's a very bad idea for many reasons. Domain-specific languages like shell solves the problem much more efficiently, securely and with less complexity.
absolutely false on all points TBH. particularly, it doesn't have to be anything remotely like JS
>>6014 >That's basically javascript
Just because a programming language is good, it doesn't mean it has to be javascript.
>a JIT (just in time compiler) on the terminal.
What's the problem with that? You do not need to use a JIT to just evaluate a snippet of code. Similar to this, LISP Machines had listeners which were REPLs which let you use a real programming language to interact with the operating system.
>Domain-specific languages like shell solves the problem much more efficiently, securely and with less complexity.
He was not arguing against a DSL, he just wants the shell to be a REPL for a good programming language. The language UNIX uses for the shell is horrid and seems like a hack that they continually expanded on.
>>6016 >Parsing? That's what bash does and it sucks.
Objection! Bash, and shells in general, do not parse.
>I want strongly typed data structures and easy ways to manipulate them
As long as your OS is written in C, I believe you're ultimately stuck with typedef struct or whatever C has.
>I'm parsing this to make my image viewer automatically find all the keyboards/mice.
Is it even a viewer's job, if you don't mind me asking?
>Why can't I just fucking do:
>keyboard
>mice
If I understood that correctly, that assumes there are some separate lists of keyboard and mice specifically crafted for your whims. That's "bloat". xD
>whereas in pic related, you _can_
Objection! I struggle with both. xD
I believe there is no working around visual complexity other than delegating it to the meaningful subroutines. The thing about C subroutines aka functions is their global scope, so we just have a long linear list of functions that is our program as far as code relations go. I love that.
Your example shows 5-layered visual complexity which is on the verge of unacceptable, I think.
>my hypothesis is that it's easier to program (and especially read code) with graphical syntax, or maybe the optimal is somewhere between graphical and textual.
I remember trying to work with something like the Delphi IDE without any background other than the Pascal language, and that shit was bonkers. I mean, it's not easy at all if you don't know what you're doing already. And I believe that would be more suitable to your needs, like, it has a certain amount of visualization to it.
So, all in all, while I agree that vision is like our #1 sense and we have to exploit it rightfully, I also believe that as far as programming is concerned, imagination comes before vision. When you look at the code, you don't really parse syntax, you delve deep into the state manipulation, if that makes sense.
>At least not the ways I've implemented it (which are different than pic related).
Oh c'mon now, you could've just shown what YOU did.
>Here is just one of the problems:
All due respect, but how do you solve this in general? Someone gives you legit-looking code in your new shell with a hidden obfuscated "rm -rf ~" in it, like, maybe not through the shell actually, but through an actual call to something like a remove() function. You paste it and then boom goes the dynamite. Like, one way to mitigate this would be to carefully limit the program's (shell's) abilities, but that calls for a different OS entirely, not just some shell replacement. Also that would require you to raise password-protected privileges for, e.g., file manipulation MANUALLY EVERY TIME YOU HAVE TO do that, and how many users are going to do that now? Will you do that? I have my doubts. But that's just one idea though.
>text boxes absolutely solve this trivial problem
Text editors absolutely solve this trivial problem.
Anyway, if I understand text boxes correctly, you have to set them up first, meaning, turn on the textbox mode or something. Pretty much equivalent to a text editor.
>absolutely false on all points TBH. particularily, it doesn't have to be anything remotely like JS
Well, JS has functions as first-class objects, so that's a start for you. xD
>>6016 >Even examining any kind of visuals over and over is bad
We don't have a choice. We are in the age of information, and using the visual cortex is essential for processing this information.
>But the least we can do is use bigger fonts.
Too big and it will reduce legibility. Hence why I suggested the SAM method. There's also the LaTeX module called microtype, which helps with some stuff (the 'rules' also applies for non-latex):
https://ctan.org/pkg/microtype
>when I'm interacting with a computer, and it doesn't respond immediately
That's an issue with the windowing system. Wayland supposedly solves this kind of thing. Also low-latency scheduling (such as MuQSS on Liquorix).
>based Eizo shill
I don't give a fuck about Eizo. I use a LG display. I just shared their link because they got some useful information about how displays affect eye strain.
>That's how they make money.
It is. But, there is a scientific basis for not using PWM if you work for too many hours.
>They also have "low blue light"
This one is bullshit. Blue light does affect melatonin levels, but putting a yellow filter above the display won't do anything (without aggressively shifting colors). If people really care about that, they should be wearing red glasses on their face, not putting a filter on the monitor.
>I think motion blur causes much more eye strain
Motion blur is good for the eyes, if used correctly. Read about the "180 degree rule" (when the shutter speed is exactly double the frame rate, so the shutter is open for half of each frame).
>And the solution to motion blur is to use timed "flickering" (backlight strobing)
Hummm... not sure about that. But newer technologies might not need 'backlight' (such as the 'Crystal LED' from Sony).
>PWM is just "oh its bad for your eyes because reaso1ns"
This is not true. Read more about non-PWM displays.
>it doesn't have to be anything remotely like JS
The way you explained it: yes, it does. Make a counterargument instead of just saying "u wrong".
>>6020 >Objection! I struggle with both. xD
But you can instantly see k(l(m)) is a parameter to b, right?
it is a bit hard to read on the left side since there's so many lines. that's another problem. it can be solved completely by trees but i get the feeling trees would suck.
point is, i think this kind of 2d syntax allows me to read more code (that other people wrote, which i have no control over) faster than if it was textual
>Your example shows 5-layered visual complexity which is on the verge of unacceptable, I think.
probably, but i come across shit like this in real code all the time. also simply moving things out to variables doesn't necessarily help, since it can cause alias hell
>If I understood that correctly, that assumes there are some separate lists of keyboard and mice specifically crafted for your whims. That's "bloat". xD
nigga you'll see what i'm talking about when i post the code. i need to use the keyboard and mouse directly because the program works without X11. and under X11 it just tells X11 to fuck off and get out of the way
>When you look at the code, you don't really parse syntax, you delve deep into the state manipulation if that makes sense.
i think i know what you mean but i mean you cant really reason about shit until you're certain what the code does. im talking about auditing and verification, not simply getting a general sense of what the program does or monkeypatching
>Oh c'mon now, you could've just shown what YOU did.
well it would be harder to explain because i would have to explain the syntax first.
>All due respect, but how do you solve this in general?
by not UNIX. imagine pasting something into a textbox (that only holds text, and not with any special bullshit like metacharacters). and then there is a key like ctrl+e to execute it. obviously you can just look at what got pasted into the textbox before deciding to execute it. the browser should be fixed too though, a webpage should only copy what's visually highlighted. of course that will never happen because web is full retard
>>6021 >Too big and it will reduce legibility. Hence why I suggested the SAM method. There's also the LaTeX module called microtype, which helps with some stuff (the 'rules' also applies for non-latex):
Nigger all this fancy ass text processing stuff is full of vulns. It would have to be super good in order to consider using it for a secure OS (which is a requirement for most modern computers).
>This one is bullshit. Blue light does affect melatonin levels
I thought they were saying it causes stuff like cataracts. Anyway blue light has always hurt to look at for me, so I use redshift and remove blue stuff from my GUIs.
>Hummm... not sure about that. But newer technologies might not need 'backlight' (such as the 'Crystal LED' from Sony).
The backlight is just an implementation detail. The way to remove motion blur is either use an insanely high refresh rate, or to turn on the display for only a tiny fraction of the time. For example at 85Hz you could do 1ms on and 10.76ms black.
>The way you explained: yet, it does. Make a counterargument instead of just saying "u wrong".
because im not even sure about what you mean by "like JS"
here's what such language should be like:
>simple reference counting implementation
>code can be compiled or interpreted. interpreter can use any code right away regardless
>no tryhard JIT shit or other such retarded attempts to do DSP from a high level language
>code, types, and any other identifiers are addressed by content (e.g you may just have a hash instead of a name). so there are no name collisions. names are mapped on separately
>no strings or string operations. they can be added as an extension, but they arent even needed unless parsing some pleb shit like a website
>no global state
>no text. language is visual boxes and shit. who cares if it looks bad. i already implemented it and it works. there's no ambiguous syntax either, whereas mainstream PLs are only fully understood (even on a syntactic level) by 1 in 100K programmers
>minimal amount of constructs
>able to copy and paste strongly typed data from anywhere to anywhere, even to/from "web" (not current web, the replacement for it) pages.
>ability to send and receive data over channels to other computers (using prefix-free encoding, which is more efficient than any plaintext bullshit or even ASN.1 without even trying)
>OS only communicates to you with strongly typed data. no parsing bullshit. no bitstring bullshit (but we're still more space efficient than 99.99% of other shit)
>console can be extended to allow the user to visualize and manipulate arbitrary data types. for example instead of Cons 1 (Cons 2 (Cons 3 Nil)), the GUI will show you a list-like visual: [1,2,3] (example, but the real implementation uses graphics), but we can do this for as many data structures as we want instead of a few builtin shits like list and map like most PLs do
and i have no idea what security problems you are referring to. the entire UNIX shell _IS_ a security problem, which my languages completely avoid by not monkeying around with strings, global state, or global namespaces
>>6020 does delphi have some way to manipulate code other than text editing (typing/cut/copy/paste)? I dont remember any such thing. i missed delphi anyway, i only experienced visual studio back then, but i figured delphi is the same shit. lets you visually make GUIs and nothing else.
both bloated as fuck but who knows maybe there are some good ideas in them that i missed
found a pic while searching delphi as an example of what i was talking about with if statements and such. the "if EraseLRCorner then" block could have stuff in it way down the page (off the screen in that screenshot) and the only way to know it's part of the block is to count "begins"/"ends" or trust the indentation and use a ruler (and delphi like every IDE has an option to draw lines for you)
>>6042 >Nigger all this fancy ass text processing stuff is full of vulns.
WTF are you even talking about? We are discussing how to make text more legible/readable. It has nothing to do with automated text processing, but with how you present the symbols (typography, size, kerning, line spacing, colors, etc).
>no text. language is visual boxes and shit.
Bloat: the definition.
Btw, how are you gonna make these visual boxes using Xorg, if you want a "secure OS (which is a requirement for most modern computers)"? You have to start redesigning the whole windowing system first, because Xorg is possibly the biggest hole in unix-based systems. I suggest you start reading the Nitpicker GUI from Genode:
https://www.acsac.org/2005/papers/54.pdf
>>6044 >Bloat: the definition.
no it's actually easier to implement than a text editor+lexer/parser
the word GUI doesnt automatically make something bloat. besides, only the total system complexity matters, and beating any modern PL/OS is not even remotely hard
> Btw, how are you gonna make these visual boxes using Xorg
i dont even bother with Xorg, just write to framebuffer directly. i plan to generalize the visual language as the basis for all GUIs, so you could call that the window system
i take dat paypa
>In 2004, the overdue discussion about what OS designers can do about GUI security was brought back to life by J. Shapiro et al. with the presentation of the EROS Trusted Window System (EWS) [22]. However, the presented approach supports only a dedicated set of applications compiled for the particular platform while the broad range of mass-market software is not available.
stopped reading there. i don't care about supporting legacy shit. it should all die. about 20 years ago we were at the point of emergency where everything needed to be deleted. and it still never happened.
>ignoring based Shapiro
anyway here's an idea: instead of letting windows needlessly move their asses all over your screen, you create a new blank window and then assign it to a program. so once that program launches (in 30 seconds from now because it's a piece of AIDS), it can only draw to that window (enforced by both API and further by using PL/OS isolation). this also solves the focus stealing problem. GUI security is moot and not even hard to do perfectly (when building a system from scratch)
>>6042 >But you can instantly see k(l(m)) is a parameter to b, right?
Not really. It's kinda the same with following the line around. Maybe a bit easier considering that we CAN follow the line, but the whole example is handcrafted anyway. Like, the code written that way is clearly a bit obfuscated, and don't tell me your shit will be immune to obfuscation.
>also simply moving things out to variables doesn't necessarily help, since it can cause alias hell
Dude, like, SCOPE? Jeez.
Only functions that are global (in C they are all global) will be susceptible, and that's probably one part of the answer to why C++ was created LMAO.
>nigga you'll see what i'm talking about when i post the code.
Yeah, that would be much appreciated.
Anyway yeah, regardless of anything, it's weird that you HAVE to parse that shit. Can you, like, get that information from the kernel directly? If it's not ready in the kernel, maybe you could make a module that could actually give you raw data structures for specific devices instead of parsing some text files. I mean, it's doable and it's hardly the issue with the OS itself, if data structures instead of strings are all you want.
>im talking about auditing and verification
That's the first time we hear about it. xD
So, how exactly does visualisation help in auditing and proving? I mean, sure, auditing may be better because of the visuals, but proving - nigga, please~
>well it would be harder to explain because i would have to explain the syntax first.
Just show it already. If I have questions, I will ask.
>and then there is a key like ctrl+e to execute it
That would work, but listen, that functionality could be hacked into a shell too, probably. Like, by default it ignores special characters and pastes everything as-is, then a special keystroke could execute the mess.
Not that it's a necessarily good idea, about hacking this shit in, I mean, but it addresses your problem without redesigning the whole software stack.
>>6046 >the word GUI doesnt automatically make something bloat
If you bring the graphic engine into the core of your language, it does. Also you're gonna manipulate internal data of your program (doing the job of a linker somewhat) with UI itself, and that's like dependencies.
>no it's actually easier to implement than a text editor+lexer/parser
If I understand correctly, your program is going to have some intermediate form which consists of like flowcharts in some binary form. That form itself is already bloat. Not that it's a bad idea. If you avoid that form, you have to transform the UI state into the working executable somehow anyway, which would also be bloat. Like, sure, you don't have to parse the text but the complexity of the logic of code generation won't go anywhere, and I think it's more complex TBH, especially if we want to have an optimizer. Like, parsing is straightforward once we have the grammar down. Code generation, on the other hand - yuck! - like, for example, we could use memory for a variable, we could load it to registers in some places and we could just replace it with a literal constant if possible. And that's just a trivial example off the top of my head.
As someone who worked as a UI designer for a while, I can pretty much say it's because most of the software's design starts and ends with developers. Devs only see the value of software to be its functionality, while clients only see the software as the UI - forcing bosses to pair devs with UX designers, who are able to manage the designs.
>>7315 false. software developers are retarded period and incapable of writing anything coherent or working. and UX is a meme field based on a fabricated reality