Please note that I explicitly said "past year 2". I believe the first two years were fairly decent, especially for those who came in with weaker basics than mine (not that mine were that good).

Notes (because the reply might sound weird without them):
- I live in an authoritarian pseudo-democratic country which I will not name, for reasons.
- I do think some parts of the education system are effectively brainwashing (primarily the way we study history), but that doesn't affect my statements here.
- I think (or at least hope) that this is an issue with my university specifically and not with our education system as a whole.
- I admit I exaggerated slightly: technically there was some useful material in more courses than one, but I stand by the opinion that the way we were taught makes only a few of them actually useful.

Also, I use the word "teacher" because there is only one professor in our department, which should have been a red flag from the beginning; but as this is a story about a local university, you can imagine that, given my life circumstances, I didn't have much choice.
Now, the fun way to start this would be to mention the specifics first.
Our databases course (now extended into database administration) did have a little bit of theory, primarily normalization and transactions. In practice, though, most of what we were actually allowed to use in assignments is something you'd pick up within 10 minutes of learning SQL (very basic operations). We were not taught constraints, or effectively allowed to use them, until last semester (four semesters into learning about databases); there was not a single mention of joins or indices, and no discussion of ACID. It took us a year to even get close to data integrity, and we were not allowed to use primary keys until the last semester. A lot of the course (the entire first year of it, really) focused either on the FoxPro DBMS or on the visual parts of Microsoft Access, so much so that assignments amounted to documenting our GUI navigation of the latter without letting us do actual database work. We did switch to Oracle SQL in semester 7, except we were forced to work through Oracle APEX, and writing raw queries was basically self-sabotage because of the description requirements for the assignment write-ups.
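For anyone wondering how little material that actually skips: here's a minimal sketch, using Python's built-in `sqlite3`, of the things we never got to touch — primary keys, a foreign-key constraint, and a join. The table and column names are made up for illustration.

```python
import sqlite3

# In-memory database; any SQLite-capable environment works.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite leaves FK checks off by default

# Primary keys and a foreign-key constraint -- the integrity features
# we were only allowed to use in the final semester.
conn.execute("""
    CREATE TABLE authors (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    )
""")
conn.execute("""
    CREATE TABLE books (
        id        INTEGER PRIMARY KEY,
        title     TEXT NOT NULL,
        author_id INTEGER NOT NULL REFERENCES authors(id)
    )
""")

conn.execute("INSERT INTO authors VALUES (1, 'Ada')")
conn.execute("INSERT INTO books VALUES (1, 'On Engines', 1)")

# A join -- never once mentioned in four semesters of the course.
rows = conn.execute("""
    SELECT authors.name, books.title
    FROM books JOIN authors ON books.author_id = authors.id
""").fetchall()
print(rows)  # [('Ada', 'On Engines')]

# The constraint actually enforces integrity: inserting a book that
# references a nonexistent author fails instead of silently corrupting data.
try:
    conn.execute("INSERT INTO books VALUES (2, 'Ghost', 99)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

That's maybe an afternoon of material, and it's more data integrity than we saw in the first three semesters.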
As an extension of the above, we received a frankly ridiculous amount of scrutiny for everything BUT the actual database management. I was once forced to spend two hours staring at a Word document in Far (which the teacher was obsessed with) because Microsoft Word glitched while I was typing a word and misplaced an error underline; this triggered the teacher so much that he made me find the reason it happened, or he would lower my grade. The same person made us spend the entire first lab class of this semester writing up what Oracle is as a company and what non-Oracle products are also named Oracle. This person alone soured my higher education to a degree that nearly made me quit university.
Our mobile development course consisted of the person running it making us choose a topic and implement a mobile application, then effectively screwing off for the rest of the semester. With zero guidance, I'd argue it was about as useful as having us follow a Flutter (or Compose, or whatever) tutorial and write a 10k-word report on it. Also, a single mix-up of "phone" and "smartphone" was punishable by extra assignments, initially in the form of a presentation on the difference between a telephone and a smartphone.
Our operating systems course was relatively decent, except that, due to the reduced hour count for the program, the only practical work was: writing up some Windows batch commands, writing up some bash/coreutils commands, and launching some Windows utilities from the command line and screenshotting the process. The lectures were decent, though, even if they were a fairly high-level overview of the OSes people use rather than of what an OS really is. Not having an assignment on multithreading was funny when we got one in our Java course.
Our neural networks course had us solve a set of quizzes about neural networks. We had no lectures and no proper introduction to what a neural network even is. The course was stolen from a paid one, which I know because, incidentally, after half-intentionally breaking the grading system of the LMS it ran on (tl;dr: arbitrary code execution in Python due to unescaped evaluation in code-runner tasks, go figure), I was tasked with rewriting this same course in a hardened way. The only benefit was that I got paid for it, though you could argue that forcing us to learn on our own was technically useful under the guise of "you need to learn to learn".
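For those curious about the bug class (this is a hypothetical sketch, not the LMS's actual code): if a code runner splices a student's submission into a string and passes it to `eval`, the "answer" is arbitrary Python running with the grader's privileges.

```python
# Hypothetical sketch of the vulnerability class: a grader that builds a
# string around the student's submission and eval()s it in-process.
def grade_unsafe(student_answer: str) -> bool:
    # Intended use: the submission is an arithmetic expression like "2 + 2".
    return eval(f"({student_answer}) == 4")

print(grade_unsafe("2 + 2"))  # True -- honest answers grade fine

# But the submission can be any Python expression, so it can execute a side
# effect of the attacker's choosing while still grading as "correct":
leaked = []
print(grade_unsafe("leaked.append('pwned') or 4"))  # True -- still "correct"...
print(leaked)  # ['pwned'] -- ...but our code just ran inside the grader
```

Hardening it means never evaluating untrusted submissions in-process at all; the usual approach is running them in an isolated, privilege-restricted subprocess or container and only inspecting the output.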
Our project management course's exam (or, well, pass/fail oral attestation?) had us talking about Windows COM, the Waterfall model, and manual testing. There was a single mention of unit testing. The course material also assumed that debuggers could still only debug 16-bit code.
Our pre-diploma course project on project management forced us to pre-plan the entire application we were going to write — architecture, specific library structure, specific class hierarchies, specific classes, fields, and methods — while actively forbidding us from writing code. People were also forced to write up a database structure even if their project didn't call for one. And we had no choice of project topic, as those are supposed to be work we do for a company. My friend has a 9k-word write-up about an S3 cache microservice, because that was the only way to satisfy the requirements.
These are just some parts of the torment we've seen here; I'm only listing things from years 3 and 4 and ignoring years 1 and 2 (which had their own problems — writing a per-symbol C++ parser was a fun one). The history of bias and straight-up bullying from the tutors is long, documented, and not acted on. The only reason corruption isn't openly involved is that one of the people teaching here was sued for taking a bribe about a year before I entered the university.
On top of all this, we've learned nothing about actual system design, security, distributed systems, functional programming, Linux, ethics, embedded, or performance engineering. Our parallel programming class was just a quiz about OpenMP without an actual introduction. Our graphics class was us making models in Blender and pretty much nothing else. Our web development course forced us to write everything in Notepad in pure HTML4, and using JS was punishable. Our OOP class was so overfocused on C++ that we got `std::function` as an exam question because "well, it's a callable object, who cares that it's actually used for HOFs". Anything related to deployment and DevOps was only mentioned because one PhD student, forced to run a subject that was meant to be entirely about Windows Active Directory, made a proper "from zero to CRUD in production" course instead — arguably the most useful course in those 4 years for the majority of my peers, as it actually made people learn about CRUD workflows, frontend, REST API design, and Docker.
I strongly believe that the way we were taught most subjects actively harmed students: we were not allowed to do our own research and use the results of it, and the lecture material was either mostly useless or grossly outdated and out of touch with reality — even though the subject structure itself is pretty good and seemingly on par with normal universities.