
Modern fonts have extra information built in to make rendering better, like hinting, which adjusts how glyphs map onto (sub)pixels.

Without those, you wouldn’t like the look of, say, a character 10px tall on a 1080p display, and you would have to move to much higher-DPI setups where characters take up more pixels.
It wouldn’t be unusable though; automatic anti-aliasing tends to be good enough on its own.
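
If you want to see the difference yourself, here is a minimal sketch using Pillow; the font path is an assumption, substitute whatever font file you have:

```python
# A toy comparison: rasterising a tiny glyph with and without
# anti-aliasing using Pillow. The font path is an assumption.
from PIL import Image, ImageDraw, ImageFont

font = ImageFont.truetype("DejaVuSans.ttf", 10)  # a 10px glyph

for mode in ("1", "L"):  # "1" = hard on/off pixels, "L" = anti-aliased greyscale
    img = Image.new(mode, (16, 16), 1 if mode == "1" else 255)
    ImageDraw.Draw(img).text((2, 2), "g", font=font, fill=0)
    img.save(f"glyph_{'binary' if mode == '1' else 'antialiased'}.png")
```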


Yeah, my main point with all those examples was that “AI” has always been a marketing term.

Curve fitting and data-point clustering are both pretty efficient when used for what they were made for. But if you then start brute-forcing huge stacks of the same thing just to get a semblance of something else, something they were never made for, of course you will end up using a lot of energy.
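
To put some scale on that, plain curve fitting used as intended is practically free; a toy least-squares fit like this runs instantly:

```python
# A minimal sketch: curve fitting used for what it is made for.
# A least-squares polynomial fit over a handful of points is cheap
# and gives a usable model immediately.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 33.0])  # roughly 2x^2 + 1

coeffs = np.polyfit(x, y, deg=2)  # fit a quadratic
model = np.poly1d(coeffs)
print(model(5.0))                 # predict at a new point
```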


We humans have it pretty hard. Our brains are pretty illogical. We generate multiple layers of abstraction to build a world view that tries to match the world we live in, and over those layers emerges a semblance of logic.
Then we make machines.

We make machines to be inherently logical, and that makes them better at logical operations than us humans. Hence calculators.
Now someone comes along and says: let’s make an abstraction layer on top of the machine to represent illogical behaviour (kinda like our brains).
(┛`Д´)┛彡┻━┻

And then, on top of that, they want that illogical abstract machine to create abstractions inside itself, first to mimic human output and then to do logical stuff. All of that, just so one can mindlessly feed data into it to “train” it, instead of thinking themselves and feeding it proper logic.

This is like saying they want to run an OS inside the browser via WASM and then install a web browser inside that OS, to do the same things they could have done with the original browser.

In the monkeys analogy, you can add that the monkeys are a simulation on a computer.


They were technically Expert Systems.
“AI” was the marketing term even then.

Now they are LLMs, and “AI” is still the marketing term.


If something uses a lot of if-else statements to do stuff like act as a “COM” player in a game, it is called an Expert System.
That is essentially what in-game “AI” used to be. It was not an LLM.
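
A toy sketch of that style, with made-up game state and rules, just for illustration:

```python
# A toy "COM player" in the expert-system style: hand-written rules,
# no training data, no statistics. The game state here is made up.
def com_move(my_hp: int, enemy_hp: int, potions: int) -> str:
    if my_hp < 20 and potions > 0:
        return "use_potion"      # survival comes first
    if enemy_hp < 15:
        return "heavy_attack"    # finish them off
    if my_hp < enemy_hp:
        return "defend"
    return "attack"

print(com_move(my_hp=18, enemy_hp=40, potions=2))  # -> use_potion
```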

Tools like clazy and clang-tidy are neither ML nor LLMs.
They don’t rely on curve fitting or mindless grouping of data points.
Their parameters are decided based on the programming language specification, and tokenisation is done directly using the features of the language. How the tokens are used is also determined by hard logic rather than fuzzy logic, and that is why the options you get in the completion list end up being valid syntax for the language.
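
As a rough illustration of parameters coming from the spec rather than from fitted data, a toy tokeniser might look like this (nothing like clang’s real lexer, of course):

```python
# A toy tokeniser whose rules come straight from a language spec,
# not from fitted data. A tiny illustrative subset, nothing like
# clang's actual lexer.
import re

KEYWORDS = {"if", "else", "return", "int"}  # fixed by the spec
TOKEN_RE = re.compile(r"\s*(?:(\d+)|([A-Za-z_]\w*)|(==|[+\-*/=;(){}]))")

def tokenise(src):
    src = src.rstrip()
    pos = 0
    while pos < len(src):
        m = TOKEN_RE.match(src, pos)
        if not m:
            raise SyntaxError(f"invalid character at {pos}")
        num, ident, op = m.groups()
        if num:
            yield ("NUMBER", num)
        elif ident:
            yield ("KEYWORD" if ident in KEYWORDS else "IDENT", ident)
        else:
            yield ("OP", op)
        pos = m.end()

print(list(tokenise("int x = 42;")))
```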


Now if you are using Cursor for code completion, of course that is AI.
It is not programmed using the features of the language, but iterated on until it produces output that happens to match those features.

It is like putting a billion monkeys in front of typewriters, selecting the one that makes something Shakespeare-ish, and killing off all the others. Then cloning the selected one, and rinse and repeat.
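
That loop is basically the classic “weasel program”; a toy version of it:

```python
# A toy version of the monkey-selection loop (the "weasel program"):
# clone the best candidate with random typos, keep the closest match,
# repeat. Brute-force "training" in miniature.
import random
import string

TARGET = "METHINKS IT IS LIKE A WEASEL"
CHARS = string.ascii_uppercase + " "

def mutate(parent, rate=0.05):
    return "".join(random.choice(CHARS) if random.random() < rate else c
                   for c in parent)

def score(s):
    return sum(a == b for a, b in zip(s, TARGET))

best = "".join(random.choice(CHARS) for _ in TARGET)  # one random monkey
generations = 0
while best != TARGET:
    clones = [best] + [mutate(best) for _ in range(100)]
    best = max(clones, key=score)  # "kill off" all the others
    generations += 1

print(f"reached the target in {generations} generations")
```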

And that is why it takes a stupendously disproportionate amount of energy, time and money to train something whose output could often be produced better with a simple bash script.


I don’t consider clang tools to be AI.

They parse the code logically and don’t do blind pattern matching or curve fitting.
The rules they use are properly defined in code.

If that was AI, then all compilers made with LLVM would be AI.



I made sure to get one with 4 slots (still haven’t managed to use them all though) and good component cooling.
Thinking back, getting the G version would have been better, but I didn’t plan on swapping in a 5800X back then.

Now it’s just about getting the right parts to accompany the CPU and RAM I have lying around. And the HDDs of course. The Toshiba one ended up having an undesirable delay, meaning I would have to go with the Seagate ones priced 50% higher.

And then there’s finding a router without a backdoor.
Hopefully I can find one that runs OpenWrt without having to go with ASUS.


The spare one is a 5600X which was previously in the same motherboard before I switched to a 5800X.

I am considering getting a B550 motherboard once I have the funds, but the number of SATA ports is kinda low. My current one is X570 with 6 ports.

I am not planning on RAID for now, as I don’t really expect to ever have enough drives for redundancy, but I definitely want to be able to use the system for any and every task I can program to control over SSH on the local network. So I’m definitely going with a full distro, like Debian or something, and not a NAS-specific thing.
I can even make use of my old Radeon 4650 for a display while setting it up.


I was looking more for something that can use my spare CPU and has enough SATA ports to last a long time.
And while USB-to-SATA is expected to be inherently unreliable, all the PCIe SATA cards I see seem to be problematic in their own right[1] (going by the reviews). The only one that seemed fine was a PCIe SCSI card, and with that I would have to be very careful to make sure I don’t get a fake.

Then, the available motherboards with >6 SATA ports all seem to be high-end ones, which doesn’t work out, considering I am trying to save some money.


  1. It doesn’t make sense to have a SATA adapter that goes around corrupting data in a way that is hard to detect ↩︎


Yeah, I get it.
But what if most motherboards were just expected to have 2 slots, you know?
Of course I won’t be switching to Threadripper just to get 2 CPUs on a single motherboard. I have kept the other CPU lying around, thinking of using it to host all the storage HDDs, and maybe offloading all the re-encoding work onto it.

For compilation though, I’m fine with just using my main PC. I did look into distributed computing options, especially for when I’m using my laptop, but I think I’m fine for now.


It’s more like a fleeting thought that came to me a few times, when I felt like playing a game while re-encoding but was unable to properly limit the CPU usage of the encoders in ffmpeg.
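
For what it’s worth, one approach that might have helped is capping the encoder threads with ffmpeg’s -threads option and lowering the process priority; a sketch, with placeholder filenames:

```python
# A sketch of taming ffmpeg's CPU usage: "-threads" limits the encoder's
# thread count, and os.nice() lowers the scheduling priority (POSIX only).
# The filenames are placeholders.
import os
import subprocess

def reencode_nicely(src, dst, threads=4):
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-c:v", "libx264", "-threads", str(threads),
         "-c:a", "copy",
         dst],
        check=True,
        preexec_fn=lambda: os.nice(19),  # lowest priority
    )

reencode_nicely("input.mkv", "output.mkv")
```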


There’s also a Q8400, btw.
That’s what is lying around now that the motherboard is not working.


I wish dual CPU motherboards were mainstream.
I could then use the one I am keeping aside during compilation/encoding tasks.

But my current computer definitely comes out ahead on everything other than the VRAM.


When I was a child and first saw a 3D game, I imagined the lighting being done by ray tracing (without knowing the actual name for it, of course).
Until then, I had only known 2D games with no lighting mechanics, just a bunch of pixels for sprites.


Well, if you are including my 15-year-old Core 2 Quad in the percentage, of course those numbers come easy.


“can be” ⇏ “has to be”

And it’s not fiction that sets high standards; it’s the people watching it who do so.

Now you may say that the people are setting those standards only because they are watching said stuff.
But that is just a rephrasing of “the people watching fiction are incapable of having their own imagination”.

Back in school, I had a classmate who was much taller than the others due to steroid usage.
Now, if you say his parents did that because they watched “JoJo’s Bizarre Adventure”, I’ll point out that it had not been released yet, and I have no reason to believe they bought comic strips from another country and made a ‘gag’ piece the basis for their standards.


If it is the system launcher and that permission was provided automatically, all you need to do is use another launcher.

I am using stock Android on a smartphone that was in the Android One program, and I have not changed the launcher.
My expectation when keeping Google’s first-party screen lock is that it won’t make it easier for me to mistakenly leave the phone unlocked.

At this point, any app installed on a certified Android device in these regions must be registered by a verified developer.

And that means that if I ever feel like making my own app for the smallest of things and just installing it on my own phone, I need to tell Google, “Hey! I am programming for Android!”, as if they don’t already have enough of my data.
And then sideloading it would probably require signing it with a certificate, so Google would always know that I made a piece of software and installed it on my phone.

There is no PR stunt here

Yes. There is no PR “stunt” here. Not everything that includes PR is a stunt.

The phrase “PR flavour text” refers to whatever PR says to make a company’s actions seem less controversial. And that is the main job of a company’s PR department.
In this case, it is:
‘This change aims to reduce malware and scams associated with unverified apps, as sideloaded apps are significantly more likely to contain malicious software’

And yes, that thing is a lie as you already explained. That is why I call it PR flavour text.


2027 and beyond: We will continue to roll out these requirements globally.

This just gives me a deadline for switching to a Linux phone.
It seems to have come earlier than I thought I would be able to manage, but I will have to manage somehow.


What’s their PR flavour text for that?

And how does a company get the authority to do a “ban”? Isn’t that supposed to be a Government thing?

Seems like their real goal is to make the users of their devices as vulnerable as possible. How?

  • First they remove the ability for other apps to record phone conversations, so we can’t use call recorder apps
  • Then, in the recording feature of their own app, they don’t record the part where it says, “This call is being recorded”, making it possible for anyone to claim that they had not been informed of the call being recorded.
  • While the phone screen goes off and locks after 15 seconds (the timer I set) when I’m doing something useful, like reading stuff on a website, if I leave it on the home screen, it sometimes randomly decides that it doesn’t want to turn off (last I checked, I waited for over 2 minutes before I pressed the power button myself)
    • and using the power button too much will wear it out faster, and it won’t have a replacement.
    • they will definitely call this a “bug” if the matter gets out of their hands. But until then, they will keep on denying it. Do they even read bug reports?

But then that lets people socialize using the game without the company being able to harvest their data.


Neither did PlayStation, from what I remember.


It would make a lot of sense for the company trying to decide how large their production run should be.

For the customer, it only really makes sense if they are getting something out of it, like immunity to possible price hikes at launch.

I don’t pre-order, but then, I am a late-stage buyer, so it doesn’t really apply to me.


Ah sh✫t, got me!

You managed to tick the buzzword box without having to make a native app for my computer.


I normally like GitLab issues as a place for bug reports.

An FAQ and an old-style forum work pretty well for help.
In fact, just make a community on Lemmy for the forum part and you’ll have what’s required.

GitHub also has this new “Discussions” thing, which should do some good for those who want to stay on GitHub.




So are you guys saying you don’t sink into your bed when you’re about to fall asleep and then rise up into levitation as you enter deep sleep?