• 0 Posts
  • 17 Comments
Joined 2 years ago
Cake day: June 14th, 2023


  • I have been using Linux since the early 90s. I don’t know it all. I read man pages. I use -h or --help. I read the arch wiki. I read docs. I read source files and examples. Lots of reading. You will never know it all. There is too much information.

    You need to know how to find information. It can be tricky. Knowing how to ask the right questions often requires you to know a bit of the answer.

    Stumbling about trying to find answers is training the skills you need.

    I think it helps if you have a programming background and IT support experience, not just because you will understand more concepts and terms but because you have already developed some of those skills. But some people come from other backgrounds and pick things up really quickly because they have well-developed research skills.


  • The only software I have paid for in the last couple of years is games. The licensing is still crooked, but they are ephemeral entertainment, so it's not like they control your life.

    The problem with commercial software isn't the price. It is the lock-in. They have you by the balls whether you pirate or pay, so I don't pirate; it doesn't address my main issue with closed source software, which isn't price but control. I prefer to adapt, sometimes live with fewer features, and use free and open source.

    It's hard if you have to work with others, which is the whole network-effect BS: everyone is on Reddit and shitter, so why aren't you? If you can work independently, though, you can get a lot done and have more control.


  • shirro@aussie.zone to Linux@lemmy.ml · Linux is religion

    No, but there is an ideological basis for free software, though it is firmly grounded in practical experience dealing with the consequences of closed source devices.

    Red Hat and Ubuntu are businesses. Debian and Arch are communities. Some of the smaller distros are basically that one guy in Nebraska.

    People promote them for various reasons. An IBM employee will have different reasons to the supporter types who latch on to a distro and mascot like it was a football team. Now football, there is a religion: it's all ritual, nothing they do has any practical use, people congregate once a week, and in some parts of the world it turns violent.

    When the deb users start committing genocide on the rpm users I'll call it a religion. Until then it's just a bunch of anime convention fans arguing about their favourite isekai.


  • There will always be the hoarders. You can't collect everything on any sort of wage/salary and live. If you have a compulsion, then piracy is a reasonably harmless pastime. You aren't depriving anyone of income for something no normal person could reasonably afford.

    For regular content consumers it is simply free market economics working. Companies innovate, offer great products and value, and take people away from the black market. Then the companies get greedy, form loose, deniable cartels, start fixing prices at higher levels and cutting quality, and consumers go elsewhere. They want the profits of a free market but don't want to play the game and compete on value and quality. Sucks for them. The government grants them a legal monopoly on monetizing their IP, but it doesn't give them a clue about how to build successful businesses.


  • Honestly I don't think piracy is great. I would rather pay a fair price for easily discoverable content, own it forever across all media, and have the bulk of the money go to the creatives who made it so they can pay their bills and feed their families. I don't watch a lot, and the little I do should be affordable. I don't feel compelled to collect it all.

    But then I go to introduce one of my kids to all the age-appropriate comics/graphic novels I bought on Comixology for an older sibling. But Comixology closed down and all the content moved into Kindle, along with a heap of all-ages content, not all of it appropriate. And Amazon are too cheap to offer family sharing outside the US. So that's hundreds of dollars in Bezos' pocket with no way to put it on a device for my kid. I could waste time breaking the DRM, or pirate, but I am leaning towards a return to dead trees.

    Netflix is crap and getting worse; I jumped ship ages ago. Disney is killing genre movies and TV with all their Marvel/Star Wars IP. Both are driven by algorithms and greed, and recycle IP instead of taking risks.


  • shirro@aussie.zone to Linux@lemmy.ml · Is Linux on Android as secure..?

    There is no simple answer. It is almost entirely dependent on implementation. All systems are vulnerable to things like supply chain attacks. We put a lot of trust in phone vendors, telcos and Google.

    If you are going to compare with something like termux, you need to compare with an equivalent sandboxed environment on regular Linux, like a docker/podman container with appropriate permissions. As far as I know they use the same Linux kernel features, like cgroups and namespaces, under the hood.
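
    To make that concrete, this is roughly the shape of a locked-down throwaway container (flags from my own podman usage; tune them for real workloads):

        # a deliberately locked-down, disposable container:
        #   --user 1000:1000  run as an unprivileged uid/gid
        #   --cap-drop=ALL    drop every Linux capability
        #   --network=none    no network access at all
        #   --read-only       read-only root filesystem
        podman run --rm -it --user 1000:1000 --cap-drop=ALL --network=none --read-only docker.io/library/alpine sh

    Under the hood that is still the same namespace/cgroup machinery Android builds its sandbox on.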

    Traditionally, Linux desktop apps run with the full permissions of the user, and the X window system lets apps spy on each other, which is less secure than Android's sandboxing by design. There have been attempts to do better (e.g. flatpak/flatseal, Wayland) but they are optional.
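
    Flatseal is just a GUI over flatpak's own permission overrides. Doing it by hand looks something like this (org.mozilla.firefox is only an example app id, and which permissions are worth dropping depends on the app):

        # revoke home directory access for one app, then inspect the result
        flatpak override --user --nofilesystem=home org.mozilla.firefox
        flatpak info --show-permissions org.mozilla.firefox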


  • Having comprehensive Unicode language coverage on a free OS is amazing. I wish the font system was smart enough to hide Noto variants in creative apps but leave them available for browsers. There is a workaround to do that, but it's a huge pain. I wouldn't delete any files managed by the package system; they will just keep coming back anyway. There are smaller collections of Noto fonts in the AUR that will satisfy the noto-fonts dependency, which should keep KDE Plasma happy. They should be a straight swap if you are comfortable with an AUR dependency for a functioning desktop. The newer one is noto-fonts-main, updated this year, or there is the older noto-fonts-lite. I haven't tried either. Usual caveats about backups and taking advice from strangers on the internet apply.
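
    If you want to attempt the swap, it should look something like this. I am assuming yay as the AUR helper and that the AUR package declares provides/conflicts for noto-fonts, which is what makes a clean swap possible; check the PKGBUILD before trusting that:

        # pacman/yay should prompt to replace the repo noto-fonts package
        yay -S noto-fonts-main
        # sanity check afterwards: how many Noto faces are still installed
        fc-list | grep -ci noto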

    Segoe might benefit more from the embedded-bitmap or autohint settings than the regular open source fonts I am likely to use. Microsoft would optimise the hell out of it to take advantage of their proprietary, patented font rendering system. I wouldn't be surprised if it rendered poorly with distro defaults. It's the kind of blind spot a lot of open source devs and packagers could easily have. It's probably packed full of embedded bitmaps for small sizes and proprietary hinting stuff that Linux won't understand.
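
    If someone wanted to test that theory, a per-family fontconfig override is the tool. A minimal sketch, assuming the family is installed under the name "Segoe UI":

        # drop a per-user rule that re-enables embedded bitmaps for Segoe UI only
        mkdir -p ~/.config/fontconfig/conf.d
        printf '%s\n' \
          '<?xml version="1.0"?>' \
          '<!DOCTYPE fontconfig SYSTEM "fonts.dtd">' \
          '<fontconfig>' \
          '  <match target="font">' \
          '    <test name="family"><string>Segoe UI</string></test>' \
          '    <edit name="embeddedbitmap" mode="assign"><bool>true</bool></edit>' \
          '  </match>' \
          '</fontconfig>' \
          > ~/.config/fontconfig/conf.d/99-segoe-bitmaps.conf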


  • I don't doubt you. Linux font rendering has been good enough for so long now that it's surprising when people say it's worse than some other system, but I think it is still a reasonably common complaint, so there has to be something to it. A lot of distros probably don't have a very good font selection installed out of the box compared with proprietary systems, and I am sure that plays a role.

    My desktop has a 38" ultrawide and the pixel density is a lot lower than your dual 4k monitors so I want to do everything I can for font rendering and your post has got me asking questions. I am in the process of configuring a minimal, low distraction tiling wm setup for a bit of fun (also another nvim conf spring clean). I hadn’t considered changing the font rendering defaults.

    I think I have all the fonts you list installed except for Hack. Inter is also a good one for UI. I don’t use Fira Code anymore for code/terminal but I keep it around. It is a nice code font with ligature support but it didn’t have an Italic variant and I like subtle use of italics in code and docs. Currently using Iosevka for mono but next week it might be something different.


  • This post is fascinating. Most distros have good defaults for font rendering now and I haven’t used hacks like infinality to fix font rendering on Linux for years. That project doesn’t even exist anymore. I would be really interested to know which setting made the difference for OP and why.

    I am writing this on a little HiDPI laptop with over 200 dpi and, to be honest, hinting and sub-pixel rendering are invisible to my eyes on this device. Apple dropped sub-pixel rendering ages ago when all their products moved to retina displays. But it's still really useful on low-DPI displays, and I thought it generally worked well enough out of the box.

    An almost identical file to this local.conf has been posted to forums in the past, but back then fontconfig often shipped with outdated defaults. My distro's defaults have anti-aliasing, slight hinting and sub-pixel RGB enabled out of the box.

    Arch has these defaults. Bookworm lacks the sub-pixel RGB one (it's just a link away), but my guess is Ubuntu derivatives probably include it (you can check what is actually active with the commands after this list):

    • 10-hinting-slight.conf
    • 10-sub-pixel-rgb.conf
    • 10-yes-antialias.conf
    • 11-lcdfilter-default.conf
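
    An easy way to see which presets are active on your machine, and what a query resolves to once all the config is applied:

        # which presets are linked in on this machine
        ls /etc/fonts/conf.d/ | grep -E 'hinting|sub-pixel|antialias|lcdfilter'
        # the effective pattern for the default sans font
        fc-match -v sans | grep -E 'antialias|hintstyle|rgba|lcdfilter'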

    The differences I see are the last three options in local.conf (sketched after this list):

    • disabling embedded bitmaps. I think this would change rendering for old MS Office fonts, and perhaps break some emoji fonts. I have Noto Color Emoji but I don't have any old MS fonts, so this seems like it would have limited impact.
    • enabling autohinting. If you have slight hinting enabled and the font contains hinting information, the font's own hints should be used automatically, so I thought this made no difference if you have good fonts installed. I might be wrong, but again, if you use good fonts I am not sure this has an impact.
    • setting font weight to medium. This is an odd one. Does it mean that every font query returns a medium weight, or that you get medium if you don't specify a weight? Fattening up thin fonts might be a user preference, but you can also select desired font weights in your desktop settings and apps.
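
    For reference, here is my reading of those three options as fontconfig rules. This is a reconstruction for illustration, not necessarily the exact file the OP used:

        # the last three local.conf options, as I understand them
        printf '%s\n' \
          '<?xml version="1.0"?>' \
          '<!DOCTYPE fontconfig SYSTEM "fonts.dtd">' \
          '<fontconfig>' \
          '  <match target="font">' \
          '    <!-- ignore bitmap strikes baked into fonts -->' \
          '    <edit name="embeddedbitmap" mode="assign"><bool>false</bool></edit>' \
          '    <!-- use the freetype autohinter instead of the font hints -->' \
          '    <edit name="autohint" mode="assign"><bool>true</bool></edit>' \
          '  </match>' \
          '  <!-- the odd one: force every pattern to medium weight -->' \
          '  <match target="pattern">' \
          '    <edit name="weight" mode="assign"><const>medium</const></edit>' \
          '  </match>' \
          '</fontconfig>' \
          | sudo tee /etc/fonts/local.conf >/dev/null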

    Fontconfig is a compiled database for font queries; it doesn't do rendering. Whatever you put in fontconfig, an app like kitty will not implement sub-pixel RGB rendering, for performance and implementation reasons, though many other terminals will. I think GTK4 might be heading the same way. Depending on variations in colour vision and displays, people tend to disagree on the value of sub-pixel RGB, but it looks like it is a common distro default anyway.


  • The local.conf file should work on any distro. It's an opinionated override and might not be ideal for everyone, but you can use the settings as a starting point for further research. Don't modify the other files in /etc/fonts, as they will be updated by the distro. Claude's other suggestions, apart from selecting some better fonts, generally do nothing as far as I can tell. I connected to one of my Debian machines, and the symbolic links Claude gave into /etc/fonts/conf.avail are a Debianism, as I suspected. If you don't install or use bitmap fonts and you override the sub-pixel RGB setting in local.conf, I don't see the point of either of those symbolic links, but whatever meatbag wrote them in a StackOverflow or Reddit post intended them for a Debian distro.


  • Freetype2, fontconfig and cairo are going to be pulled in as dependencies when you install just about any desktop app/library (e.g. Firefox, GTK), so this is a no-op. Same for installing the JRE and fontconfig again: it's pointless. The freetype2.sh line is commented out because that has been the default setting for the last 8 years, so it makes no difference. The gfx.webrender.all setting in Firefox is an override to force something it is most likely already doing based on the detected environment; if you check about:support, chances are you are already using hardware rendering. And it's a performance setting, not a quality setting. Half of this makes no sense.

    Installing nicer fonts is always a good idea, as is setting your desktop and application default fonts.

    Some of the local.conf settings could potentially make a big difference if your desktop environment defaults/user settings aren't good. I don't know a huge amount about freetype settings, but I suspect using assign in there might override desktop environment settings, which some people might not want. I set mine in a GUI like the monkey I am.

    My conversation with any LLM tends to go: “you got a, b, c wrong, it should be d, e and f”, and it says “sorry, of course it should be d, e and f, my mistake, here it is with d, e, f, g and h”. Then I say “g and h are wrong, it should be i and j”. And it keeps going. In the end I write it myself. Huge time wasters.

    Edit: I didn't pick up on it immediately, but the two symbolic link commands are suss (they are for Debian-based distros). EndeavourOS is Arch-based, and fontconfig on Arch keeps the configs in /usr/share/fontconfig; the ones in the conf.default directory should already be linked into /etc/fonts/conf.d. 10-sub-pixel-rgb is in /usr/share/fontconfig/conf.default, so it is already linked for me, and attempting another link without deleting the existing one would be an error: another no-op. I don't like RGB sub-pixel rendering, so it's overridden in my desktop settings; it shouldn't be necessary on high DPI, IMO. The proper path for 70-no-bitmaps is in the /usr/share/fontconfig/conf.avail directory if you want to link it properly (see the commands below). If you use the wrong path, as Claude suggested, it's another no-op. If, like me, you don't have any bitmap fonts installed, it won't make any difference anyway. Also, /etc/fonts/conf.d is created by the fontconfig package, so that is another no-op.
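
    For anyone who does want the no-bitmaps rule on an Arch-based system, the correct incantation is roughly this (the file name is what ships on my box; newer fontconfig releases shuffle these names around, so list conf.avail first):

        ls /usr/share/fontconfig/conf.avail/ | grep -i bitmap
        sudo ln -s /usr/share/fontconfig/conf.avail/70-no-bitmaps.conf /etc/fonts/conf.d/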

    Edit 2: The setting's name might be inaccurate, but ClearType is the name of Microsoft's proprietary font renderer and isn't available on Linux, so gfx.font_rendering.cleartype_params.rendering_mode was possibly picked up from some StackOverflow discussion about Firefox font rendering on Windows. I won't say it doesn't work without reading the Firefox source code and/or trying it, but I suspect a setting with that name would have no effect on Mac or Linux.


  • The expense of tools, equipment and supplies can be a huge barrier to car maintenance, but there is so much legitimately free software for computers (even ignoring the pirated stuff) that people have never had so much opportunity.

    It is like learning another language or a musical instrument: people have to be committed and practice to get good, and few people can make the effort. Businesses have trained people to seek instant gratification from fast food, social media, TikTok, gambling, loot boxes, and consumerism in general, because short-lived and unfulfilling experiences produce an endless monetization opportunity. The rare people with the discipline and support to focus their efforts have massive advantages, with access to information and tools that were very hard to come by in the past. There are some prodigies out there in a sea of mediocrity.


  • True that copyright has always existed to protect publishers, not creators. But in pre-digital times there were considerable barriers to publishing and distributing creative works at scale, so while publishers in all media have often abused creators, they were a necessary evil if you wanted to make a living.

    The worst trick greedy capitalists have pulled recently is to bypass copyright and steal the entire digital record of human creative labor to incorporate into proprietary models and services for their own enrichment. I have no idea how society and our political representatives have slept through that. The second worst is insanely destroying their own industry by fucking over both consumers and creatives with increasingly unsustainable, greedy and dumb bullshit.

    Access to education and other equitable causes really should be fair use. If everyone pirated (and the way things are going, it will be the only sane way to get content), then new content is going to dry up unless people are happy with AI slop. We will still see indie self-published works, but those creators necessarily won't have access to the same resources we saw when they were part of an exploitative but productive industry. That sucks. A lot of people are happy to pay for convenient and affordable access to content under reasonable conditions, and piracy is something they only resort to when that is denied.