• 2 Posts
  • 29 Comments
Joined 1 year ago
Cake day: June 17th, 2024

  • generally yes. but we are talking about a public, network-facing device that is usually the first line of defense against the wider internet.

    it needs to be updated for new threats. those threats are not as extensive as 20 years ago (a lot of stuff is way better), but bugs still show up in routers, as the router-hack news that pops up from time to time shows.


  • openwrt uses a linux kernel that is very near the latest (LTS) release. they kinda have to do this for the support it adds for new devices, new wifi standards, and so on.

    a company that supports its own limited product range doesn’t need the newest kernel that much, because contrary to popular belief most kernel changes are not security related, and their devices don’t change hardware-wise.

    but having said all of that, if I were you and my device were supported by openwrt, I would probably migrate to openwrt and be free of a small company’s limited support.
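
    for reference, a quick way to see which LTS kernels are currently maintained is kernel.org’s public releases.json feed. a minimal sketch, assuming that endpoint keeps its current field names; compare its output against whatever kernel your router’s firmware ships:

    ```python
    import json
    import urllib.request

    # kernel.org publishes current release info as JSON
    # (assumption: the feed keeps its "releases"/"moniker"/"version" fields)
    with urllib.request.urlopen("https://www.kernel.org/releases.json") as resp:
        data = json.load(resp)

    # print the maintained longterm (LTS) series to compare against a device
    for rel in data["releases"]:
        if rel["moniker"] == "longterm":
            print("LTS:", rel["version"])
    ```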


  • I remember that extension, but I didn’t use it because it didn’t clean up after itself (old, no-longer-needed images stayed in the cache).

    what is “your phone”? you mean an app? I know about text caching (I don’t know if freshrss has an option to fetch the original page for feeds that carry just a short text that redirects to the full page), but even inoreader, which had that (if I remember correctly), didn’t have image caching.


  • so they removed the others but allowed one to stay? wow. so much democracy.

    and if they vote for a party aligned with russia it is “propaganda”, but if they vote for parties that say russians are orcs and the west is the best, it is because of vast pristine primal knowledge so pure you can see through it. got it.

    I have no idea if those “pro-russia” parties are good or not. but if the only fault you found with them is that they don’t treat russia as the eternal enemy number one (which is the standard gauge in the west), then they are ok.

    every party has different views on the issues (if they didn’t, they would just join the others), so saying a party is banned because it is pro another country is not democracy (per my understanding of it, of course).

    but it is their country and they are free to do what they like; just be sure not to call them “democratic” elections.


  • oh don’t get me wrong. as I said, I agree with most of your original (and now second) post.

    my gripe with grain was not about av1 per se. it was with movie makers who add it just because they think that is how movies should be.

    this is ridiculous to me: “Reasons to Keep Film Grain On: Artistic Effect: Film grain can add a nostalgic or artistic quality to video and photography, evoking a classic film look”, because the reason is just the director’s nostalgia; as in, if he had been born after the digital era, he would take issue with grain and not add it (usually).

    about h264 and transparency: the issue is not that h264 only gets there at a high bitrate; the issue is that av1 (as I read) can’t get there at any bitrate.

    but overall I agree with you.

    I was even recently shocked to see how much faster av1 encoding has gotten. I would have thought it was still orders of magnitude slower, but at some settings (comparable to x265’s slow preset) av1 now has about the same encoding speed.
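
    a minimal way to sanity-check that yourself, assuming an ffmpeg build with libx265 and libsvtav1 (the input file, presets, and crf values are just placeholders):

    ```python
    import subprocess
    import time

    SRC = "sample.y4m"  # placeholder: any short lossless test clip

    def encode(codec_args, out):
        """Run one ffmpeg encode and return wall-clock seconds."""
        t0 = time.time()
        subprocess.run(["ffmpeg", "-y", "-i", SRC, *codec_args, out],
                       check=True, capture_output=True)
        return time.time() - t0

    # x265 "slow" vs SVT-AV1 at a mid preset (lower svt preset = slower/better)
    t_h265 = encode(["-c:v", "libx265", "-preset", "slow", "-crf", "22"], "h265.mkv")
    t_av1 = encode(["-c:v", "libsvtav1", "-preset", "6", "-crf", "30"], "av1.mkv")
    print(f"x265 slow: {t_h265:.1f}s  svt-av1 preset 6: {t_av1:.1f}s")
    ```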


  • I want to agree with you, and I do to a large extent. I like new codecs, and having a more open-source codec is better than using a codec encumbered by many patents. long-term patents (the current situation) slow technological progress.

    what I don’t agree with you on are some of the details.

    first, Netflix, youtube, and so on need low bitrates, and they (especially google/youtube) don’t care that much about quality. youtube videos are really bit-starved for their resolutions. netflix is a bit better.

    second, when many people discuss codecs they are referring to a different use case: archiving. as in, the best-quality codec at the same size. so they compare the original (raw video, no lossy codec used) with the encoded ones (see the comparison sketch at the end of this comment). their conclusion is that av1 is great for size reduction but can’t beat h264 for fidelity at any size. I think h264 has a placebo or transparent profile but av1 doesn’t.

    so when I download a fi…I mean a linux ISO from torrents, I usually go for the newest codec. but recently I don’t go for the smallest size, because it takes away details from the picture.

    but if I want to archive a movie (one that I like a lot, which is rare), I get the bigger h264 (or, for uhd blu-ray, h265).

    third: a lot of people’s idea of codec quality is formed by downloading or streaming other people’s encoded videos, and they never compare the quality themselves (as they don’t have the time or a good raw source to compare against).

    4th: I have heard av1 has issues with film grain, as in it removes it. film grain is an artifact of physical film (non-digital) that unfortunately many directors try (or used to try) to duplicate, because they grew up watching movies on film and think that is how movies should look, so they add it in post-production. even though it is literally a defect, and even the human eye doesn’t reproduce it, so it is not even natural. but this still is a bug of av1 (if I read correctly), because a codec should aim for high fidelity, not high smoothness.
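
    as mentioned in the second point, those archiving comparisons usually come down to a metric run against the untouched source. a minimal sketch, assuming an ffmpeg build with --enable-libvmaf; the file names are placeholders:

    ```python
    import subprocess

    REF = "source.y4m"       # placeholder: untouched source
    DIST = "encode_av1.mkv"  # placeholder: the encode under test

    # ffmpeg's libvmaf filter scores the first input (distorted) against the
    # second (reference) and prints an aggregate VMAF score to stderr
    result = subprocess.run(
        ["ffmpeg", "-i", DIST, "-i", REF, "-lavfi", "libvmaf", "-f", "null", "-"],
        capture_output=True, text=True)

    for line in result.stderr.splitlines():
        if "VMAF score" in line:
            print(line.strip())  # e.g. "VMAF score: 95.43"
    ```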


  • you didn’t do the wrong thing.

    what many people don’t notice is that hardware support for a codec in a gpu has two parts: decoding and encoding.

    for quality video nobody does hardware encoding (at least not on consumer systems like this nvidia 3050).

    for most users the important thing is hardware support for decoding, so that they can watch their 4k movies with no issue (the sketch at the end of this comment shows one way to check).

    so you are in the clear.

    you can watch av1 right now, and av2 won’t become popular enough to matter for at least another 4 years.
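
    a minimal sketch of that decode check, assuming ffmpeg is installed: it lists the hwaccel backends the local build knows about, plus any registered av1 decoders (whether the gpu really decodes av1 still depends on the silicon generation):

    ```python
    import subprocess

    def ffmpeg_list(flag):
        """Return ffmpeg's output for a capability-listing flag."""
        return subprocess.run(["ffmpeg", "-hide_banner", flag],
                              capture_output=True, text=True).stdout

    # hardware acceleration backends compiled into this ffmpeg build
    print(ffmpeg_list("-hwaccels"))

    # registered av1 decoders (e.g. libdav1d in software, av1_cuvid/av1_qsv in hw)
    for line in ffmpeg_list("-decoders").splitlines():
        if "av1" in line:
            print(line.strip())
    ```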


  • maybe, maybe not.

    when h264 was introduced (Aug 2004), even intel had HW encoding for it with sandy bridge in 2011. nvidia had it in 2012.

    so less than 7 years.

    av1 was first introduced 7 years ago, and for at least two years android TVs have been required to support HW decoding for it.

    and AMD rdna2 had the same 4 years ago.

    so from introduction to hardware decoding it took about 3 years (rough date arithmetic at the end of this comment).

    I have no idea why 10 years is thrown around.

    and av1 had to compete with both h264 and h265. (they had to decide whether it was worth implementing at all.)
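
    the rough date arithmetic behind those gaps, using the dates cited above (launch days are approximate):

    ```python
    from datetime import date

    # dates as cited in this comment (approximate, commonly quoted launch days)
    h264_intro = date(2004, 8, 1)    # h264 introduction per the comment above
    h264_hw_enc = date(2011, 1, 9)   # intel sandy bridge (quick sync) launch
    av1_intro = date(2018, 3, 28)    # av1 1.0 spec frozen
    av1_hw_dec = date(2020, 11, 18)  # amd rdna2 launch with av1 decode

    print(f"h264 -> hw encode: {(h264_hw_enc - h264_intro).days / 365:.1f} years")
    print(f"av1  -> hw decode: {(av1_hw_dec - av1_intro).days / 365:.1f} years")
    ```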