/g/ - Technology

install openbsd

the truth about unix niggers and their shit C language Nanonymous No.5649 [D][U][F][S][L][A][C] >>5673 >>5692 >>5739
File: fd3ca55062e63a15963c49be0bd4d16d371ece96b9c39c2c2085959538142291.png (dl) (7.12 KiB)
https://news.ycombinator.com/item?id=10917414
>The price of commercial compilers wasn't the only issue with Ada. It never had any friends in the UNIX culture.
>UNIX culture always ignored safer system programming languages from the '60s and '70s.
>If the hacker culture bashes Java for being verbose, what would they say about Algol languages like Ada?
https://www.blaisepascalmagazine.eu/academy/programming-in-c-or-pascal-english/
>Of course, the poor legibility of C is a direct consequence of the use of special characters. Here you can see that the authors just did not have the experience that Wirth had, as many operators are poorly chosen. The symbol ^ for dereferencing in Pascal reproduces the nature of a pointer much better than the star, which is also used as a multiplication sign. It has done better with the “->”, with which composite components can be addressed. And I have never understood why == and != is used for comparisons, instead of = and <>. I often have the impression that the sole aim of C syntax is to be different from Algol or Pascal, and not more easily comprehensible.
>Perhaps the original aim was to save space, but this means that C programs have to be commented much more, which wins you nothing. The partially cryptic nature of C actually attracts some programmers who love this; there are even programming competitions in C such as “Obfuscated C”. Here programs that no one can read win prizes. In this example the need to avoid a buffer overflow and append the binary zero makes the C program even longer than the Pascal program.
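For reference, a minimal C sketch of the extra work that last sentence is about: cap the copy and append the binary zero yourself (buffer size and names made up for illustration):

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *src = "input that may well be longer than the buffer";
    char dst[16];

    /* avoid the overflow, then append the binary zero by hand,
       because strncpy() does not terminate when src is too long */
    strncpy(dst, src, sizeof dst - 1);
    dst[sizeof dst - 1] = '\0';

    printf("%s\n", dst);
    return 0;
}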
now it all makes sense
unix niggers choosen C as their language because it is unreadable and they can pretend to be evil dark hackers that type strange symbols into computer. same like when they claim that CLI is better than GUI, in reality they prefer CLI because they can memorize some nigger commands from manual and then feel like DARK HACKERS that control computers with strange letters and stuff
unix niggers care about how they are perceived, they don't care about productivity or software safety and quality. so unix niggers are similar to objective-c apple gay starbucks soy devs, they are just from different team, both teams hate each other but in reality both teams are bunch of dumb niggers
hacker/unix culture = dumb shit niggers that want to show off that they memorized some cryptic magic commands and they are DARK HACKERS.

Nanonymous No.5650 [D] >>5673
http://archive.adaic.com/intro/ada-vs-c/ada-vs-c.html
>In contrast, C and C++ emphasize ease of writing, rather than ease of reading, resulting in a greater use of operators, the use of terser notations (the conditional expression, for example), and an absence of keywords (most noticeably in declarations). This makes C++ programs harder to transmit and to maintain and, therefore, more expensive.
>One of the most striking aspects of the C and C++ communities is the number of language idioms that are part of the programming culture. Idioms are typically short, cryptic expressions with great power, a notion most intensely developed in the APL community. The study of idioms has produced such entertaining books as The C Puzzle Book and it is clear that C and C++ programmers enjoy enormously the 'guess what this computes?' game. The question isn't fun, though, if the answer isn't in some way unexpected! These languages systematically violate the informal rule of design known as the principle of minimal surprise.
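A sketch of the sort of idiom being described (not from the article, just the canonical examples):

#include <stdio.h>

int main(void)
{
    char buf[32];
    const char *s = "guess what this computes";
    char *d = buf;

    /* the classic one-liner: assignment, pointer increments and the
       loop test all packed into a single expression */
    while ((*d++ = *s++))
        ;

    /* the conditional expression mentioned in the quote */
    int hits = (buf[0] == 'g') ? 1 : 0;

    printf("%s (%d)\n", buf, hits);
    return 0;
}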

Nanonymous No.5651 [D] >>5691
you think you are not faggot like soyboys because you use "manly" unix and C instead of apple, rust, javascript? if you are so manly then show me how many women and teens have you fucked. you can pretend to be manly with your shitty unix and CLI programs but I know you are incel and loser, I bet that even Steve Klabnik gets laid more than you and fucks more trannies that you ever will.

Nanonymous No.5652 [D]
>lispweenies

Nanonymous No.5654 [D] >>5657
A lot of UNIX weenies are boomers who grew up on UNIX and are unable to imagine that anything better can even exist.

Yeah, when I was a boy we used to have to toggle switches on
a com panel. Sure the machine was big, slow, cost millions
of dollars, used up enormous amounts of electricity and
generated enough heat to warm a football stadium, but we got
the job done and we LIKED it! None of this namby-pamby RISC
goobledy-floo, where you can wait a few seconds to see your
compile finish.

In my day, we used to go away for WEEKS because we knew the
average turn-around time on a batch day was 3.5 days and the
average availability of the machines was 18 hours out of 24.
Most of us have to wear glasses now because of years
squinting at hex dumps printed by a failing ribbon on a line
printer that we had to keep running with wire and chewing
gum. Most of us have the pallor of albino cave fish from
years spent in climate controlled basements tending machines
that cost 500 times as much as we made in a year.

None of this simpering about how much my portable PC weighs
or how much Sun raised its prices. In my day we used to
thank GOD that we had a tool like COBOL, a language that
was supposed to be for humans and whose programs looked like
assembly instructions for a Y-12 bidirectional wing-wang
oscillator.

These days you hear these whining wimps going on about
"functional" languages and "logic programming" languages and
"object-oriented" languages. Pfah! We had a real job to do
and we didn't have any choice: we had to use COBOL because
we had spent $50,000 on the compiler and it was the only
language that our ancient, outmoded hardware
supported, and we had no money left after trying to maintain
hundreds of thousands of lines of poorly written, poorly
structured, non-portable code that cost us thousands of
hours and dozens of burnt-out programmers. There was a real
job to be done and we did it and WE LIKED IT!!

Nanonymous No.5657 [D] >>5696
>>5654
sounds like interesting times. But did you notice how much worse the entire rest of the world has gotten in the meantime? You should be bragging about having married and then not divorced your first love in high school, not about how COBOL > RPG.

Nanonymous No.5659 [D]
The quote following my post was sarcasm if you couldn't tell.

Nanonymous No.5673 [D] >>5674 >>5693 >>6048
>>5649
A poor imitation of the UNIX weenie poster. And thumbnail posting as well. Dishonorabu.
>UNIX culture always ignored safer system programming languages from the '60s and '70s.
Simply because you could understand hardware back then and worrying about squeezing out every bit of performance was important. That and the physical size of the typed program mattered as well. That was before people even dreamed about computers being widely accessible for abstract tasks or CIA niggers spying on you.
>I often have the impression that the sole aim of C syntax is to be different from Algol or Pascal, and not more easily comprehensible.
A problem still present and it's probably not going away, UNIX or not. Blame retarded investors who need to be wowed by shiny and different things.
>same like when they claim that CLI is better than GUI
How exactly does this relate to the legibility and safety of your program? A GUI is most efficient when it is tailored to the use of a person that is most likely to need it. If you simply put ALL available options in a single interface design, it will look like shit and be unusable by anybody except the people who have what is approximately the CLI knowledge. You just sound butthurt about nothing in particular.
>hacker/unix culture = dumb shit niggers that want to show off that they memorized some cryptic magic commands and they are DARK HACKERS
Seems to have missed the obvious criticism of information about CLI tools being available only in terse man pages or badly written info manuals (in the case of Linux). Or were you too retarded to get to this point in using them? Apple has good GUI design, by the way. It's just that people don't want to ape it and apply it to open source software, for some reason.

>>5650
If you take a source from Blaise Pascal Magazine, it's going to be favorable towards Pascal. If you take a source from AdaIC, it's going to favor Ada. Not saying they are bad languages or wrong about C/C++, but your choice of links is questionable. I also didn't mention it before, but both of them lost the early race because their compilers were (and for Ada, still partially are) not available for free and you had to shell out a lot of money to have them.

The real problem you seem to be missing is accelerationism in programming. There were a lot of useful concepts developed along the way to $current_year that seem to be lost in time, and nobody, not even the open source hobbyists who would benefit from them, is looking back to try and piece them together. And, since a lot of programming is now just slapping together third party library functionality, it is a monumental task. All that happens is that some of these lost concepts get bolted on to languages over time or get shittily adapted by some new hipster language. I think it was Dijkstra who said that knowing computer science theory is important because all that happens in the software world is people take chunks out of it, move them around, and proclaim they've discovered a new paradigm. But that would no longer be just about UNIX. Where is the editor implemented entirely in Lisp, weenie?

Nanonymous No.5674 [D] >>5694 >>5715
>>5673
>(and for Ada, still partially are)
I have actually looked for Ada compilers, and all I can really get my hands on is GNAT, the GNU Ada compiler. And guess what? Ada runtime library is written in C LMAO. Not that it shouldn't be if our use-case is some UNIX-like OS, and apparently (obviously) you can have some subset of Ada in pure Ada, but the whole situation is just nonsense. If the advantage of the language's safety is to be had at all, it should be system-wide. So, embedded systems with some Ada subset, here we go, I guess.

Nanonymous No.5675 [D]
>Ada runtime library is written in C LMAO
Or maybe not exactly like that, but it definitely links to the C Standard Library and some others.

Nanonymous No.5677 [D] >>5678 >>5694 >>5715
>hackers disliked Ada because it's safe
Or it failed for the same reason Multics failed.
>the poor legibility of C
Except for type declarations mixing pointers and type qualifiers like const/restrict, I don't see any problem with C's syntax.
It's also quite ridiculous coming from someone using a language that can't be read without an editor having matching bracket highlighting.
>GUI > CLI
Show me how easily you compose those GUI programs. CLI + good manpages is better than a GUI with menus acting as a poor documentation, at least in the long term.

UNIX is just an alpha for Plan 9, anyway.

Nanonymous No.5678 [D] >>5682
>>5677
>Or it failed for the same reason Multics failed.
Ada didn't really "fail" as in "not have any non-meme application at all".
Anyway, what's that about Multics?
>Except for type declarations mixing pointers and type qualifiers like const/restrict, I don't see any problem with C's syntax.
How is that a syntax problem, anyway? Do you maybe mean the weak typing issues, like a lot of implicit typecasting in C? I mean, that's the exact shit that would cause some obscure bugs, right?
Though (*()*()*())() nonsense actually kills me too. You have functions in the global namespace; why the heck you would make shit like pseudo-OOP or whatever in C is beyond me. I mean, it's legible if you don't go too deep, but some people are just perverted.
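For the record, a small sketch of where that declarator soup comes from and the usual way out of it (all names here are invented for the example):

#include <stdio.h>

/* raw form: fn is a pointer to a function taking int, returning
   a pointer to a function taking int and returning int */
int (*(*fn)(int))(int);

/* the same shape becomes readable with typedefs */
typedef int (*int_op)(int);
typedef int_op (*op_maker)(int);

static int twice(int x)  { return 2 * x; }
static int negate(int x) { return -x; }

static int_op pick(int which) { return which ? twice : negate; }

int main(void)
{
    fn = pick;                 /* both declarations describe pick() */
    op_maker mk = pick;
    printf("%d %d\n", fn(1)(21), mk(0)(21));
    return 0;
}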

Nanonymous No.5682 [D] >>5684
>>5678
>Anyway, what's that about Multics?
Very big and Enterprise(tm). It was so bad that something as flawed as UNIX managed to take off just because of the post traumatic bloat disorder caused by Multics.
>How is that a syntax problem, anyway?
I meant that "const char *const restrict var" is horrible to read and understand what qualifies what.
You're right that function pointers can be ugly too, but they can be very useful. I use these for some simple methods like print, free or cmp in "generic" (void *) structures but it has to stop here, indeed.
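Roughly the kind of "generic" structure that means, as a sketch; every member and function name here is invented:

#include <stdio.h>
#include <string.h>

/* the container knows nothing about its elements; behaviour comes in
   through a few function pointers, and (as said above) it stops there */
struct vec {
    void **items;
    size_t len;
    void (*print)(const void *);
    int  (*cmp)(const void *, const void *);
};

static void print_str(const void *p)              { puts((const char *)p); }
static int  cmp_str(const void *a, const void *b) { return strcmp(a, b); }

int main(void)
{
    void *words[] = { "multics", "unix" };
    struct vec v = { words, 2, print_str, cmp_str };

    for (size_t i = 0; i < v.len; i++)
        v.print(v.items[i]);
    printf("cmp: %d\n", v.cmp(v.items[0], v.items[1]));
    return 0;
}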

Nanonymous No.5684 [D] >>5687
>>5682
>It was so bad that something as flawed as UNIX managed to take off just because of the post traumatic bloat disorder caused by Multics.
So, the same shit happens with Windows now, but also with glowniggers.
>I meant that "const char *const restrict var" is horrible to read and understand what qualifies what.
Oh, yea, I always have to refer to docs for that, because I prefer not to use it anyway. The C type system is a list of guidelines, not requirements LOL.
Some random thoughts on this by suckless cucks, in scc README files:
> - const: The definition of const is not clear in the standard. If a const value is modified the behaviour is undefined behaviour. It seems it was defined in order to be able to allocate variables in ROM rather than error detection. This implementation will not warn about these modifications and the compiler will treat them like normal variables (the standard specifies that a diagnostic message must be printed).
>restrict: This qualifier can only be applied to pointers to mark that the pointed object has no other alias. This qualifier was introduced to be able to fix some performance problems in numerical algorithms, where FORTRAN could achieve a better performance (and in fact even with this specifier FORTRAN has a better performance in this field). Ignoring it doesn't make the compiler non-standard and in almost all applications the performance will be the same.
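For what the qualifier buys in practice, a sketch (the function is invented for the example):

#include <stdio.h>

/* with restrict the compiler may assume a and b never alias, so it can
   keep *b cached across the two stores instead of reloading it -- the
   FORTRAN-vs-C performance gap the README is talking about */
static void add_twice(int *restrict a, const int *restrict b)
{
    *a += *b;
    *a += *b;
}

int main(void)
{
    int x = 1, y = 10;
    add_twice(&x, &y);
    printf("%d\n", x);    /* 21 */
    return 0;
}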

Nanonymous No.5687 [D]
>>5684
const is a documentation and static checking tool, not a performance one.
restrict is almost always unnecessary, but when you need it, it had better be there.

On a fun note, anyone noticed how niggerfaggots using js/python needed some time before understanding why static typing is a feature and not a chore? I mean, look at these retards trying to fix their disaster of a language with mypy/typescript.

Nanonymous No.5689 [D]
>7.12 KiB

Nanonymous No.5691 [D]
>>5651
this

Nanonymous No.5692 [D] >>5694
>>5649
>I have never understood why == and != is used for comparisons, instead of = and <>
Well, that cleared up any doubt about whether this retard has a clue about what he's talking about.

Nanonymous No.5693 [D] >>5699 >>5701 >>5715
>>5673
>Simply because you could understand hardware back then and worrying about squeezing out every bit of performance was important. That and the physical size of the typed program mattered as well.
you can say that about 60s and 70s. but how about later times or today? unix niggers still use shit languages like C and C++. and pascal is pretty old
>How exactly does this relate to the legibility and safety of your program?
it doesn't. it just shows more unix brain damage
>If you simply put ALL available options in a single interface design, it will look like shit and be unusable by anybody except the people who have what is approximately the CLI knowledge.
that's not how you design GUI
>Apple has good GUI design, by the way. It's just that people don't want to ape it and apply it to open source software, for some reason.
unix niggers don't develop GUI because they don't give a fuck about users and usability of their shit software. they just want to feel special, like hackers, because they type some magic commands into black screen
>If you take a source from Blaise Pascal Magazine, it's going to be favorable towards Pascal. If you take a source from AdaIC, it's going to favor Ada. Not saying they are bad languages or wrong about C/C++, but your choice of links is questionable.
they make arguments; you can counter them
>I also didn't mention it before, but both of them lost the early race because their compilers were (and for Ada, still partially are) not available for free and you had to shell out a lot of money to have them.
but now that has changed. why won't the unix and open source community adopt those superior languages?

Nanonymous No.5694 [D] >>5701 >>5739 >>5751
>>5674
>I have actually looked for Ada compilers, and all I can really get my hands on is GNAT, the GNU Ada compiler. And guess what? Ada runtime library is written in C LMAO.
there is a Pascal/Object Pascal compiler, free and open source, that is made in Pascal. you can use it instead of Ada

>>5677
>I don't see any problem with C's syntax.
uses some stupid small symbols instead of words
pascal "array" c "[]"
pascal "procedure" "function" c " "
pascal "not" "and" "or" c "!" "&&" "||"
this is only good when you write code because it's fast and low effort. but then when you want to read, understand, improve, or debug the code, it's unreadable and cryptic
also (see the sketch after this list):
-C uses same symbol for pointers and multiplication.
-strings are not strings, they are arrays of char
-array is a pointer at same time. even fucking integer (int) and pointer are basically same thing
-C makes you define variables and types wherever you want, pascal forces you to one place
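the sketch, for what it's worth (standard C, names invented for illustration):

#include <stdio.h>

int main(void)
{
    int a[3] = { 2, 3, 5 };
    int *p = a;              /* the array quietly decays into a pointer  */
    int x = *p * a[1];       /* the same '*': dereference, then multiply */

    char s[] = "hi";         /* a "string" is just an array of char,
                                terminated by the zero byte              */
    printf("%d %s\n", x, s);
    return 0;
}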

>It's also quite ridiculous coming from someone using a language that can't be read without an editor having matching bracket highlighting
what language?
>Show me how easily you compose those GUI programs.
you don't "easily". that's the point. even monkeys and nigger can design CLI, it's just a long list of all commands. that's why unix niggers prefer it, it doesn't take any design and effort to make CLI
if you want to make a GUI, you need to do some analysis and design. you need to think about what functions your program has and how to control them in an easy and clear way. then you make some prototypes on paper, then implement it in a GUI designer, code it.
it takes effort but in the end there is great effect, user just starts the app and need to click a few buttons to get great results instead of losing many hours of his life for nigger manuals
>CLI + good manpages is better than a GUI with menus acting as a poor documentation
properly designed GUI is not just a list of all commands in a menu. but even a stupid menu is better than CLI because it at least lists the commands and groups them by category (menu root item)
>at least in the long term.
CLI is shit in both short and long term
you lose your time memorizing stupid commands, sometimes just to use the app a few times. even if you use the app often, you will have to read the fucking manual again and again if you want to do something different than last time
CLI is only good for the programmer, it is always bad for the user

>>5692
>>I have never understood why == and != is used for comparisons, instead of = and <>
>Well that had cleared any doubt about whether this retard has a clue about what hes talking about
no, it didn't. do you have an argument for == and != instead of = and <>?
it is bug prone to use = for assignment and == for comparison; often in C programs someone uses = instead of ==
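the bug class in question, sketched:

#include <stdio.h>

int main(void)
{
    int x = 0, y = 7;

    if (x = y)        /* meant ==: this assigns, and 7 is "true", so it runs */
        printf("oops, x is now %d\n", x);

    if (x == y)       /* the comparison that was intended */
        printf("equal\n");
    return 0;
}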

Nanonymous No.5696 [D]
>>5657
>A lot of UNIX weenies are boomers who grew up on UNIX and are unable to imagine that anything better can even exist.
This.

Nanonymous No.5699 [D][U][F]
File: 2b2c7770fd5c5b0fff4bc827721e95447e98cf79cd2f9a00cccce2bf08c58787.jpg (dl) (1.84 MiB)
>>5693
>Physical size of programs does not matter today
For high performance applications, the physical size of the application still matters today. Instructions are cached, and ICache misses will cause serious slowdown. Related - this is a big problem with C++ - template expansion (monomorphization) causes code bloat. We have some techniques like PGO to fix this (I remember M$VC will reorder functions to give cache locality) but they are in theory inferior to higher-level languages with JIT where the run-time information is superior to whatever profile you can cook up beforehand.

Nanonymous No.5701 [D]
>>5693
>>5694
>but now that has changed. why won't the unix and open source community adopt those superior languages?
>there is a Pascal/Object Pascal compiler, free and open source, that is made in Pascal. you can use it instead of Ada
Pascal is shit.
Some of the shortcomings like the lack of break/return, poor low-level access and the actual lack of typecasting in any standard seem like a bad deal. I WANT this unsafe shit.
I remember coding Pascal @ school, and it was horrible compared to C.
I dunno, maybe Ada is better. It certainly looks more mature.
>pascal "array" c "[]"
>pascal "procedure" "function" c " "
>pascal "not" "and" "or" c "!" "&&" "||"
These are minor or even questionable TBH. A lot of alphabetic keywords tend to mesh together.
The guy you were replying to addresses more legit concerns.
> -C uses same symbol for pointers and multiplication.
This is more of a concern for a compiler developer really. Here comes the lexer hack LMAO
> -strings are not strings, they are arrays of char
And nobody gives a shit. Like, what the actual fuck is with all this C criticism about that shit. Null-terminated strings are elegant as fuck. Any solution requiring some elaborate type for a string with a size or whatever would be shit as far as backward/forward compatibility goes, at the very least.
> -array is a pointer at same time. even fucking integer (int) and pointer are basically same thing
I agree that weak typing in C is probably too weak, but it usually works. I mean, it WILL warn you when implicitly converting integer to pointer or something.
>-C makes you define variables and types wherever you want, pascal forces you to one place
I am not sure if you know what you're talking about. The C89 standard does not allow you to declare variables randomly within a block, for example. Anyway, it's always good style to define your types or whatever in the actual header file and not randomly somewhere.
>it is bug prone to use = for assignment
It really isn't unless you come from some language background that uses = for comparison LOL. It really is a matter of taste. If you strongly distinguish between assignment and comparison (the ideas), you shouldn't have problems.
Pascal uses := for assignment IIRC. I don't remember how the construction "for i = 1 to 1000 do" behaves in that language though. Maybe Pascal will raise some kind of error because it distinguishes between an assignment expression and a comparison expression. C does not, but that is NOT a syntax issue.
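On the declaration-placement point, a sketch of the difference (try compiling it with -std=c89 vs -std=c99):

#include <stdio.h>

int main(void)
{
    int i;                  /* C89: declarations must precede the statements */

    for (i = 0; i < 3; i++)
        printf("%d ", i);

    {
        int j = 42;         /* still fine in C89: a new block starts here */
        printf("\n%d\n", j);
    }

    /* C99 and later also allow "int k = 0;" right here, after statements,
       and "for (int k = 0; ...)" -- both are errors under -std=c89 */
    return 0;
}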

Nanonymous No.5715 [D]
>>5693
>that's not how you design GUI
No shit. But do you understand what I was getting at there? That, using a GUI, you will eventually come across a problem that the interface simply has no tools for. CLI, by exposing all available program manipulations, enables a power user to do everything possible. But a casual user might find it confusing. Which is why both should exist for every more or less complex program.
>unix niggers don't develop GUI because they don't give a fuck about users and usability of their shit software
Clearly wrong, as you have GUI interfaces for a lot of programs. The problem is that they are bad, because they are designed by people who prefer CLI. We have Tyson Tan for logos, similar people are needed for open source GUIs to make them non-autistic. Since you seem to be railing against the terminal, it's a niche that you can always fill.
>but now that has changed. why won't the unix and open source community adopt those superior languages?
Because of what I said about software design being the slapping together of third party library functionality. If most of your code will be not in your preferred programming language, why not just roll with what they're written in? Only rustrannies seem to be working on rewriting basic tooling and libraries, which is a shame.

>>5674
>So, embedded systems with some Ada subset, here we go, I guess.
If you look at Ada's partners and the companies they list as using the language (Airbus, NASA, DoD, etc.), I'm pretty sure embedded is exactly what it's used for in the commercial industry.

>>5677
>UNIX is just an alpha for Plan 9, anyway.
Plan 9 is just an alpha for Project Oberon.

Nanonymous No.5717 [D]
C sucks, Ada's better, Lisp is bad for other reasons.
The important thing is to adopt Prolog
Prolog syntax best syntax. To logically seek goals in a universe of facts and rules > all.
shitty_language(L) :-
    evaluation_model(L, E),
    ( E = sequential_execution_of_statements;
      E = sequential_execution_of_words;
      E = evaluation_of_single_complex_expression;
      E = too_many_parens ).

% facts assumed here so the queries below actually run as shown
evaluation_model(c, sequential_execution_of_statements).
evaluation_model(ada, sequential_execution_of_statements).
evaluation_model(forth, sequential_execution_of_words).
evaluation_model(lisp, too_many_parens).
evaluation_model(haskell, evaluation_of_single_complex_expression).
evaluation_model(prolog, goal_directed_search).

?- shitty_language(c).
yes
?- shitty_language(ada).
yes
?- shitty_language(lisp).
yes
?- shitty_language(haskell).
yes
?- shitty_language(prolog).
no

QED. None of you are without sin.

Nanonymous No.5739 [D]
>>5649
>unironically linking to hacker news
>tranny tier complaining about irrelevant aesthetic bullshit
>oh no the equal sign isn't mathematical equality
>(nevermind that neither is the boolean == operator)
>thumbnail sized image
stay on ycombinator

>>5694
>array is a pointer at same time. even fucking integer (int) and pointer are basically same thing
Now you've moved from aesthetic complaints to semantic ones. This shows you have no understanding of C, or why it was/is successful.
>C makes you define variables and types wherever you want, pascal forces you to one place
>makes you
stupid nigger, it makes you do nothing
>pascal forces you to one place
K&R did the same. Some people still follow that convention. Most people realize it's drop-dead retardation, but then I suppose there's a reason you're promoting it.

Nanonymous No.5751 [D]
>>5694
Yes actually. = stands for assignment while == tests for equality and returns true/false. You can't use the = sign as it is already taken; it means something else already. <> could be used, but != feels much more like a complementary sign to == (! is used for negation) than <> does. < and > are used to test for greater than/less than, which has nothing to do with equality. Overall the way we have it is much more intuitive and logical.

Nanonymous No.5755 [D] >>5764
>words better than symbols for operators
Does pascal use add or sub instead of + and - (of course it doesn't)? Even Tcl uses symbols where appropriate; this has the consequence of not needing syntax colouring to easily differentiate between operators and operands.
Consider me rustled if this is supposed to be an ebin droll.

Nanonymous No.5764 [D]
>>5755
behold:
%---
:- module bestmath.
:- interface.
:- import_module io.
:- pred main(io::di, io::uo) is det.
:- implementation.
:- import_module int.

main(!IO) :-
    io.print_line(42 `add` (11 `sub` 11), !IO).

:- func add(int, int) = int.
add(A, B) = A + B.

:- func sub(int, int) = int.
sub(A, B) = A - B.
%---
all the convenience of operators without the ugliness of syntax!
and note the 'int' module import: Without that the +/- operators aren't even available for ints. So you could make your own module that only provides wordy math.
Lisp is technically also capable of all this, but ain't nobody got that many parens to spare.

Nanonymous No.5894 [D]
i agree with this

Nanonymous No.5899 [D][U][F]
File: 806fec14905a1ce3a19dce8ba0fb80da0a43c36a4adcb82c42286191dc25d4fc.jpg (dl) (19.72 KiB)
sheeit i didnt know nanochan was back. have a UNIX rant i tried to post before 8gag was kill

Nanonymous No.5900 [D][U][F] >>5901 >>5902 >>5942
File: 2bb9f1fc5c3fda0ff395cbd2549001b08268cd089dfb7b05c46214ab6569363b.png (dl) (105.96 KiB)
You niggers don't understand shit. You go around asking "if X is so good why isn't it used?". If you understood the structure of the software industry even in the least, you'd know that every decision is simply driven by money and other things that have absolutely nothing to do with software quality. Your average programmer gets to work at 9, does whatever the sheet says to do (which are a bunch of trivial tasks with _trivial benefits_[1]), thinks about how he's gonna pay his mortage/car/retirement, and hurries out the office at 5. He doesn't even have a _conception_ of alternate methodologies/techniques/tools, nor does he even understand his own methodology/techniques/tools.
Software quality is measured by evaluating and then set by emulating the status quo (and simply copying verbatim, where legally allowed. for example using the same dogshit meme library everyone else uses). Why the fuck do you think every god damn messenger program suddenly started looking like a Microsoft Word document combined with HTML from 2010 onwards? Why do you think they only started making "secure" messengers after Snowden? (Most of the ones that aren't complete shit existed before Snowden)

1. such as:
>Fix one (out of N) XSS vulnerability #3572835235 because someone made a CVE about it
>Finally decide to fix "all" XSS vulnerabilities because company is looking bad because got a few CVEs in one week
>A month later, a CSRF vuln is found
>6 months later, 5 XSS vulns are found
>Add a test case to make sure bug #5723582 doesn't happen again because a manager ordered this so it looks like he's doing something about bug #5723582
>Some dependency went on protest, configure the project to use the new one (lol this one can only happen in UNIX braindamage land)
>If I drag an icon from my file browser into a text field, the app crashes. fixed!
>Add feature du jour (such as """emojis""" or variant #523523 of it, or some faggot's face with a circle around it)
>Write blog article starting with "Here at niggerfaggot inc" and post it to HN
>It's all going Flat UI now. redesign UI (an "epic")
>It's all going web now. move entire program into a web app ("epic")
>Fix race condition #572835 but introduce new bugs at the same time
>The security-marketing department and/or NSA wants us to add {2FA,password hashing variant #52352,phone verification,IP address location based detection of "compromised" account...}

Nanonymous No.5901 [D]
>>5900
this guy gets it

Nanonymous No.5902 [D][U][F]
File: e77b018a56bdba07124c011be00cc987923559ff0ee10e7b59b7f32818cb9553.jpg (dl) (1.64 MiB)
>>5900
>average programmer gets to work
>work
==He who does not work, neither shall he eat==
Based Paul strikes again, checkmate, anarchist!

Nanonymous No.5942 [D]
>>5900
It's what Open Source and Free Software is there for, in theory. And only in theory, because the good ones are often BSDd and bought out by a company that does the same shit you described.

Nanonymous No.6048 [D] >>6049
>>5673
>unix niggers choosen C as their language because it is unreadable and they can pretend to be evil dark hackers
C was invented specifically to write the Unix operating system.
C is only successful because Unix was successful.
Now that all major operating systems are written in C, it's way too late to burn everything down and rewrite them in something else.
You're welcome to try though.

>same like when they claim that CLI is better than GUI
You can't script a GUI. GUIs are only good for exploratory, casual usage; when you don't know what you're doing basically.

>hacker/unix culture = dumb shit niggers that want to show off that they memorized some cryptic magic commands
With man pages and autocomplete you don't really have to memorize anything.

Nanonymous No.6049 [D]
>>6048
>You can't script a GUI.
Of course you can. It's just that most GUI programmers don't bother adding scripting support. You can also make a GUI for a CLI program, achieving both GUI and CLI capabilities.