Ok, someone needs to explain how your weird, non 1-10 scale grading system works...
It went something like this:
Back in the latter half of the 20th century, somebody said "hey, those with High School diplomas do better in life than those without. Let's make sure everybody can get one of those!" For a while, that was good: they poured resources into figuring out why some students were failing (malnutrition, bad home lives, etc.), and worked to improve how people learned so that more of them actually gained the knowledge to graduate High School.
Then the "But
everyone should have a High School diploma any way possible! Look how good people do with one!" squad started up. They forgot that it had value for TWO reasons. The first is that it represented that you could actually gain a certain amount of knowledge, and some of that was quite useful in life (math, reading, communication skills, etc) to a relatively set standard. Secondly,
it had value because not everybody had it! In order to achieve the goal of "everybody" having it, they had to lower standards to the point that some people graduating High School could barely read. And then
And once the goal of "everybody" having it was achieved, it became worthless as a signal, regardless of the curriculum changes that let everybody get it: you can't distinguish between people on an easy, universal standard like that. So NOT having it is basically a disability, rather than having it being a mark of pride and/or accomplishment.
And this is how we get to today, where even Undergraduate Degrees are losing value, because so many people have them. When it was unique to have one, even a degree in Art History could get you a decent office job, since it "proved" you could handle the paper/desk work required to earn such a degree. But with so many university/college graduates, those degrees are also somewhat devalued. Now the only "guarantee" of a job (with only an undergrad) is a vocational degree, like Engineering, Nursing, or (sometimes) Teaching (though that's often a Masters, and so doesn't apply the same way). And even in those fields, not always.
I'll give an example of that last point. I have an engineering degree, and there were stories from, apparently, 10-15 years before I started, about how the traditional "introduction" to the Engineering faculty was: "Look to your left, and look to your right. If you survive, those two people will be gone." It was meant to represent the ~50% flunk-out (or transfer-out, or whatever) rate of Engineering students after 1st year.
But the university administrators didn't like that high rate, as it made them and/or the Engineering department look bad. They still had to hit accreditation standards (thankfully), so what did they do? They moved the "weeder" course (a course designed to "weed out" those who aren't going to make it) into SECOND year. This is REALLY BAD, because if you change majors in 1st year, you can still "probably" graduate in 4 years (maybe with some summer school), 5 at most. But after 1.5 or 2 years? If you can't cut it, you may just flunk out entirely rather than change programs, or (even worse) "stick it out" and just barely scrape through. The further you are down a road, the harder it is to switch, and they made it harder to switch, so people just failed. Second-year dropout rates skyrocketed, and many of those students didn't transfer, I mean they just straight-up dropped out. To my knowledge they haven't switched back to a 1st-year weeder yet, though I hope they have by now. But there's still a number of "marginal" engineers who probably would have been better off switching out in 1st year, but got "stuck" later and squeaked by. And that's not good.
TL;DR: Lowering standards bad.
(No hate intended toward Art History grads. It's just a stereotypical target.)