Computer Science: Theory and Application
First of all, sorry if this was already posted; I couldn't find any reference to it on this subreddit.
Looks like the long-awaited TAOCP Volume 4B, or more formally "Combinatorial Algorithms, Part 2", will be available on 11 October, i.e. ten days from now! I'm quite excited; this will be one of the most interesting books in the series.
See Knuth's page on the books for reference.
Hey y'all, I want to talk about programming from the human perspective, how we think about code, and the benefits of introducing a paradigm that changes that.
My work sent me to a Microsoft conference recently, where one particular session caught my interest. It was about "low code", the approach used in products like Microsoft's Power Platform. The main selling point of low code is obviously that it is easier to learn and use without a coding background. To us computer nerds, that sounds lame, because it is just dumbing down the super powerful and awesome stuff we can already do perfectly well. This particular session was a panel discussion, and honestly, most of the panelists described it exactly that way. Most of them were concerned with how it can be used to help non-tech people build business solutions...
But there was one panelist who talked as if he were my own internal monologue. He said that low code is going in the same direction computation always has: it makes you sacrifice a little bit of control so you can focus on higher-level things, kind of like how we don't care about memory locations and CPU registers when we write in Python or Java.
I caught him after the session to give him a cryptic warning about the future of low code. His vision aligns perfectly with mine, but not necessarily with all of the current trends. A lot of low code is not really abstracting away the complexity of modern languages; it is just taking the features of existing programming languages and removing a large portion of them, essentially dumbing them down. No objects, no scoped variables, no functions, no parameters, no recursion, etc. Low-code solutions do remove some of the complexity of things like connecting to data sources or interfacing with other systems, but the logic itself is just programming, only stupider.
But what I think low code should be is a paradigm shift. If you compare Java to an assembly language, sure, you're not worried about CPU registers anymore, but you are working with higher-level constructs that are more powerful: objects, functions, packages, libraries, etc. Low code should similarly let you work with even higher-level constructs and change how you, the human, solve the problem.
Anyway, I would love to start building my own language parser to explore some of these ideas. Nobody knows exactly how to make a beautiful new programming language with a fundamentally different paradigm, but in your experience, what are the variables? What could you tinker with, or turn upside down, to maybe change programming forever? One great example in my mind is actually Excel. It does computation, but in a very different way from any conventional programming language. It narrows the scope a lot, but it became a powerful and famous tool for things that would previously have required complex coding and a different, probably more complicated, problem-solving approach.
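To make the Excel point concrete, here is a toy sketch of spreadsheet-style dataflow in Python: you declare cells as constants or formulas, and evaluation order falls out of the formula graph rather than out of statements you sequence yourself. The class and method names (`Sheet`, `set`, `get`) are invented for illustration; this is not any real low-code product's API.

```python
# A toy spreadsheet-style dataflow engine. Cells are either constants or
# formulas; the engine resolves dependencies on demand, so the "programmer"
# thinks in relationships, not control flow. All names here are made up.

class Sheet:
    def __init__(self):
        self.cells = {}  # name -> constant value, or callable(sheet) -> value

    def set(self, name, value_or_formula):
        self.cells[name] = value_or_formula

    def get(self, name):
        cell = self.cells[name]
        # A formula cell is a callable that reads other cells; a constant
        # cell is just a value.
        return cell(self) if callable(cell) else cell

sheet = Sheet()
sheet.set("price", 19.99)
sheet.set("qty", 3)
sheet.set("subtotal", lambda s: s.get("price") * s.get("qty"))
sheet.set("total", lambda s: round(s.get("subtotal") * 1.08, 2))

print(sheet.get("total"))  # 64.77, derived from the whole formula graph
```

Change `qty` and ask for `total` again and the new value flows through automatically. The user never writes control flow, which is roughly the kind of shift I'd want a serious low-code paradigm to commit to.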
I would like to share with you this visualization of the formula 1^2 + 2^2 + ... + n^2 = n(n+1)(2n+1)/6. I hope you will enjoy seeing the visual proof manifest itself as the assembly of a 3D puzzle.
This video was prepared using the manim library.
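Unrelated to the animation itself, but if you want a quick sanity check of the identity, here is a minimal Python snippet. It only verifies the formula numerically for small n; the 3D puzzle in the video is what actually proves it for all n.

```python
# Numeric sanity check of 1^2 + 2^2 + ... + n^2 = n(n+1)(2n+1)/6.
for n in range(1, 2001):
    lhs = sum(k * k for k in range(1, n + 1))          # brute-force sum
    rhs = n * (n + 1) * (2 * n + 1) // 6               # closed form
    assert lhs == rhs, f"mismatch at n={n}"
print("identity checked for n = 1..2000")
```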
I've been doing research into maximum cardinality matching for a project. Most sources discuss the Hopcroft–Karp algorithm. Some of them mention that the problem can be reduced to a network flow problem and solved with the Ford–Fulkerson algorithm.
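For context on how those two relate, here is a minimal sketch of the augmenting-path idea, which is essentially Ford–Fulkerson specialized to the bipartite case (unit capacities, implicit source on the left, sink on the right). The technique is textbook; the function name and adjacency encoding are my own. Hopcroft–Karp improves on this by finding a maximal set of vertex-disjoint shortest augmenting paths per phase, giving O(E·sqrt(V)) instead of O(V·E).

```python
# Maximum bipartite matching by repeatedly searching for augmenting paths:
# Ford-Fulkerson specialized to a bipartite graph with unit capacities.

def max_bipartite_matching(adj, n_left, n_right):
    """adj[u] lists the right-side vertices adjacent to left vertex u."""
    match_right = [-1] * n_right  # match_right[v] = left vertex matched to v

    def augment(u, visited):
        for v in adj[u]:
            if v not in visited:
                visited.add(v)
                # Match u to v if v is free, or if v's current partner can
                # be re-matched along some other augmenting path.
                if match_right[v] == -1 or augment(match_right[v], visited):
                    match_right[v] = u
                    return True
        return False

    # One augmenting-path search per left vertex; each success grows
    # the matching by exactly one edge.
    return sum(augment(u, set()) for u in range(n_left))

# Left vertices 0..2, right vertices 0..2.
adj = [[0, 1], [0], [1, 2]]
print(max_bipartite_matching(adj, 3, 3))  # 3 (a perfect matching)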
But both of these algorithms are old and slow. There have been 50 more years of research on the topic since they were published, and many faster algorithms for solving network flow have been found. Just a few months ago, a near-linear time max-flow algorithm was discovered.
So my question is: why is Hopcroft–Karp still recommended in so many places (GeeksforGeeks, Stack Overflow, Wikipedia, random programming blogs)? Is it because the faster algorithms are more complex? Or do the newer algorithms only work in special cases?