> What is the last thing you thought?, Tech Edition

 
post Mar 7 2022, 21:27
Post #7261
cate_chan



Technekololigy Enthusiast
****
Group: Members
Posts: 406
Joined: 4-May 18
Level 114 (Ascended)


QUOTE(dragontamer8740 @ Mar 6 2022, 21:57) *

Finally added a fan to my router after it overheated again while torrenting last night. Mounted with panel-mount RCA jacks and cotton swabs.
was that the one model of wrt1900 that doesn't have the built-in fan?

 
post Mar 8 2022, 05:29
Post #7262
Moonlight Rambler



Let's dance.
*********
Group: Gold Star Club
Posts: 6,487
Joined: 22-August 12
Level 373 (Dovahkiin)


QUOTE(cate_chan @ Mar 7 2022, 19:27) *

was that the one model of wrt1900 that doesn't have the built-in fan?

One of them, yes.
Only the original release version (with the mini PCIe slots) had the fan, as far as I can tell.
This is a WRT1900ACS v2.
In my experience, even a minuscule amount of air flow works wonders for overheating things.
Do you have a 1900AC?

Just recalibrated my laptop using spectral samples from someone else with a different model with a Boe Hydis AFFS screen. It's improved (but not perfected) the accuracy. Things look significantly better, although still a tiny bit off compared to my calibrated powerbook and desktop. Calibrating and profiling takes a while, so I think I'll try with another Hydis spectrometer sample file when I go to sleep.

I also edited my FVWM config so that wallpapers automatically get corrected with whatever my color profile du jour is when they're selected.

This post has been edited by dragontamer8740: Mar 8 2022, 05:47

 
post Mar 8 2022, 20:50
Post #7263
cate_chan



Technekololigy Enthusiast
****
Group: Members
Posts: 406
Joined: 4-May 18
Level 114 (Ascended)


QUOTE(dragontamer8740 @ Mar 8 2022, 05:29) *

One of them, yes.
Only the original release version (with the mini PCIe slots) had the fan, as far as I can tell.
This is a WRT1900ACS v2.
In my experience, even a minuscule amount of air flow works wonders for overheating things.
Do you have a 1900AC?
huh, completely misremembered. I also have an ACS, think it's a V1 though. for some reason I thought they added a fan later on, but it seems they took it away.
never ran into much overheating (yet) with mine, but I guess I'll keep it in mind when I get around to making a proper server rack, so I can give that a fan and a spot with good airflow too

 
post Mar 9 2022, 08:43
Post #7264
trmtrololo



Casual Poster
***
Group: Members
Posts: 152
Joined: 12-October 19
Level 239 (Lord)


After almost a decade with this computer it is going to be time to update soon.

 
post Mar 10 2022, 10:47
Post #7265
Moonlight Rambler



Let's dance.
*********
Group: Gold Star Club
Posts: 6,487
Joined: 22-August 12
Level 373 (Dovahkiin)


QUOTE(trmtrololo @ Mar 9 2022, 06:43) *
After almost a decade with this computer it is going to be time to update soon.
Why, though?
My newest computer is a decade old this year. It works well. My laptop is even older and more frequently used. Interested to hear your reasoning.

Thought: I just was reading up on the privatisation of Australia's telecom services. Wondering how people from Aus on here feel about it.

This post has been edited by dragontamer8740: Mar 10 2022, 10:48

 
post Mar 12 2022, 14:06
Post #7266
EsotericSatire



Look, Fat.
***********
Group: Catgirl Camarilla
Posts: 12,740
Joined: 31-July 10
Level 500 (Ponyslayer)


I was watching a video about a person who finds the location of scammers.

Damn, any comp with a wifi chip can be located to within 20m.

Any desktop connected to a router with an access portal that doesn't need a login can be located even more easily.

Google has made it far too easy to locate people, even when they're using VPNs.

 
post Mar 12 2022, 19:44
Post #7267
Moonlight Rambler



Let's dance.
*********
Group: Gold Star Club
Posts: 6,487
Joined: 22-August 12
Level 373 (Dovahkiin)


Twitter is such an awful platform for artists.

We need someplace like Pixiv that isn't Japanese and therefore bound to enforce censorship laws. Like if newgrounds gave you full resolutions for all images in posts with more than one.

Nothing has really filled the crater left by Tumblr's demise.

Edit: just implemented automatic color correction for whenever I set desktop backgrounds (applies a 3D lookup table to the image and makes the resulting image the wallpaper).
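For the curious, that sort of pipeline only takes a few lines. This is a hedged sketch, not my actual script: the .cube path, output path, and the fvwm-root call are placeholders, though ffmpeg's real lut3d filter does the correction:

```python
import subprocess

def build_lut_cmd(image, lut_cube, out):
    """ffmpeg command applying a 3D LUT (.cube file) to an image."""
    return ["ffmpeg", "-y", "-i", image, "-vf", f"lut3d={lut_cube}", out]

def correct_and_set(image, lut_cube, out="/tmp/wall-corrected.png"):
    """Color-correct the wallpaper, then hand it to FVWM's root-window setter."""
    subprocess.run(build_lut_cmd(image, lut_cube, out), check=True)
    subprocess.run(["fvwm-root", out], check=True)  # feh --bg-fill etc. works too
```

feh or xwallpaper would work just as well as fvwm-root for the last step.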

This post has been edited by dragontamer8740: Mar 12 2022, 22:26

 
post Mar 13 2022, 09:07
Post #7268
Scumbini



C O C K INJURED
******
Group: Gold Star Club
Posts: 912
Joined: 2-December 15
Level 461 (Dovahkiin)


QUOTE(dragontamer8740 @ Mar 12 2022, 19:44) *

Twitter is such an awful platform for artists.

fixt

 
post Mar 14 2022, 02:41
Post #7269
EsotericSatire



Look, Fat.
***********
Group: Catgirl Camarilla
Posts: 12,740
Joined: 31-July 10
Level 500 (Ponyslayer)


If DuckDuckGo is now censoring results, it's almost like all tech companies are in the same club.




edit: on 4K the color looks better than Blu-ray. For the quality of the film itself, it depends whether they go back to the source material or work from the Blu-ray material.

Some Blu-rays have more detail due to aggressive sharpening and de-noising algorithms.

On the 4K you can notice when it switches between an OG source and the super-sharpened / smoothed, denoised scenes. Enhancements seem to be done on a per-scene basis, whereas some Blu-rays just have processing and filters thrown at everything, like when you see purple and pink tint everywhere.

I was reading that others have noticed the scene-by-scene differences. Each scene looks the best that they could manage, but it can be very noticeable for some films.

For anime, the main advantage of 4K... is color.




So for 4K to be worthwhile, you need not just a 4K TV but a higher-end TV that can represent the color.

This post has been edited by EsotericSatire: Mar 14 2022, 05:15

 
post Mar 14 2022, 09:26
Post #7270
Moonlight Rambler



Let's dance.
*********
Group: Gold Star Club
Posts: 6,487
Joined: 22-August 12
Level 373 (Dovahkiin)


QUOTE(Scumbini @ Mar 13 2022, 07:07) *
fixt
Can't believe I overlooked that. Thanks.

QUOTE(EsotericSatire @ Mar 14 2022, 00:41) *
edit: 4K the color looks better than Blu-ray. For the quality of the film it depends whether they are going back to source material or working from the blu-ray material.
Are you talking about chroma subsampling? Because that's the only reason 4K would look better on an identical screen, unless they use different colorspaces entirely.

QUOTE
Some blu-rays have more detail due to aggressive sharpening and de-noising algorithms.

On the 4K you can notice when it switches between an OG source and the super sharpened / smoothed denoised scenes. Enhancements seen to be on a per scene basis where some blurays just have processing and filters thrown at everything like when you see purple and pink tint everywhere
I can show you an example of this on a blu-ray.
(image: https://i.imgur.com/RqTAJMil.png)
(image: https://i.imgur.com/v0f7WfEl.png)

A 4K source being downscaled for a 1080p screen will have the same good colors as on a 4K set, assuming both displays have the same gamut, white point, black point, and bit depth, and if YUV is involved the same method for translating back to RGB.

What you're doing is comparing a 1990 CD master to a 2005 CD master, or a 96 kbps MP3 done by a good encoder in 1999 vs. a 96 kbps Opus file done by a good encoder in 2019. They aren't really comparable: they may share a bit depth, but bitrates and encoding techniques vary. Neither's flaws come from being sampled at 48 kHz (or 44.1 kHz), though.

So it's the secondary things that got improved on with the opportunity to break backward compatibility, rather than the resolution in and of itself.

So 4K is still unnecessary, but what you are identifying is a limitation of a particular (15+ year old) method of 1080p video encoding. A 4K rip downscaled with a good resampler will look just as great on a 50" 1080p set from a reasonable distance.

Chroma subsampling IS a problem that 4K would help with (even if it still happens at the same ratio), but it would be far less wasteful to do 4:4:4 encodes in 1080p than to do 4:2:0 encodes in 4K for similar effective quality. The things that most hold back good color are bit depth and gamut, with bit depth being by far the most crucial thing that needs upgrading. Color accuracy would also be welcome, but average people would have to realize their shit sucks before colorimeters ever go down in price.
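To put rough numbers on the "less wasteful" point, here's the raw per-frame arithmetic (uncompressed 8-bit, sampling layout only; a codec changes the absolute sizes but not the ratio):

```python
def frame_bytes(width, height, subsampling, bits=8):
    """Raw bytes per frame for a planar YUV frame.

    4:4:4 stores three full-resolution planes (3 samples/pixel);
    4:2:0 stores full-res luma plus two quarter-res chroma planes
    (1.5 samples/pixel); 4:2:2 sits in between (2 samples/pixel).
    """
    samples_per_px = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return int(width * height * samples_per_px * bits / 8)

full_hd_444 = frame_bytes(1920, 1080, "4:4:4")  # 6,220,800 bytes
uhd_420     = frame_bytes(3840, 2160, "4:2:0")  # 12,441,600 bytes
# 1080p 4:4:4 keeps full-resolution chroma at half the raw data of 4K 4:2:0.
```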

Oh yeah, Nvidia won't give you 10-bit video unless you pay through the nose for it. Another reason to avoid their GPUs.

QUOTE(EsotericSatire @ Mar 14 2022, 00:41) *
For anime, the main advantage of 4k... is color.

So for 4K to be worthwhile, you need not just a 4K TV but a higher-end TV that can represent the color.
That's mostly right, but somewhat misleading.
It needs to be a higher-end TV that not only can display the color, but also doesn't have a gamut so wide that calibrating/correcting it for 10-bit video causes banding, like super-wide-gamut ones do. It also needs to be regularly reprofiled if it's an OLED, which will suffer from white point and white level drift as the blue (and other) emitters degrade.

It does not have to be 4K; a 1080p TV playing back downscaled 4K source material can look just as good at normal viewing distances.

In other words, color volume (like three dimensional volume) is as important as color representation area (gamut) is.

Side-tangent:

Also, for good anime (mainly stuff from two decades ago or more, with a few later exceptions), there are so many problematic elements at play. For film sources: different telecine methods (old-school and scanner-based), different film pigment formulations/emission spectra, projector light spectrum, and the condition of the film overall, including the endemic problems of Eastmancolor (see [color fading writeup at US National Archives](https://unwritten-record.blogs.archives.gov/2016/01/11/film-preservation-101-why-are-old-films-sometimes-pink/)).

For media that was only preserved on tape, there are problems like the color broadcast system, the country, whether the tape is an early generation or had to be rescued from a later one, [dot crawl](https://en.wikipedia.org/wiki/Dot_crawl), and [hanover bars](https://en.wikipedia.org/wiki/Hanover_bars). NTSC in particular had a problem with color burst phase drift, especially before semiconductor-based recording gear became common.

Most of this stuff can be fixed, but all the fixes have some tradeoff for visual quality. And 'accuracy' is up for debate with film especially because a color grader basically can only guess what the correct colors are, unless the film stock has reference images printed on it (similar to TV test patterns) that rate of decay can be determined from.

Relatively excellent source media:
(image: https://i.imgur.com/nW380x6g.png)

Relatively crap source media:
(image: https://i.imgur.com/dwiwmsdg.png)

Very crap source media (Fuck you, Toei Animation):
(image: https://i.imgur.com/cNscZiLg.png)

Production quality/budget determines a lot, too, but even low budget stuff usually had nice background gradients and stuff.

Also, funnily enough, even cheaper CRT TVs still have a better color gamut/volume than most modern consumer sets, largely on account of being analogue (and the standard color CRT phosphors). If you go to "wide gamut" sets you can beat them on gamut, but you won't beat them on volume.

This post has been edited by dragontamer8740: Mar 14 2022, 10:08

 
post Mar 14 2022, 12:56
Post #7271
EsotericSatire



Look, Fat.
***********
Group: Catgirl Camarilla
Posts: 12,740
Joined: 31-July 10
Level 500 (Ponyslayer)


QUOTE(dragontamer8740 @ Mar 13 2022, 21:26) *

Can't believe I overlooked that. Thanks.

Are you talking about chroma subsampling? Because that's the only reason 4K would look better on an identical screen, unless they use different colorspaces entirely.



Yes, they are using Rec. 2020 for a wider color gamut, 10-bit color, and enhanced subsampling, and they support HDR10 and Dolby Vision. It's most likely artificial enhancement on the older content to boost the color space.

The thing I can't understand is why they say the standard supports full-bandwidth chroma subsampling but then they just use 4:2:0 anyway. It seems they leave it to the player / TV to convert back to 4:4:4 if it's supported.

Seems to be a limitation of HDMI 2.0.


Looking at another source, to get 4:4:4 chroma subsampling you need the whole chain to be 18 Gbps compatible: the UHD input (4:4:4 or raw RGB), the player, the cable, and the chip in the TV all correct. No wonder I had so much trouble going through HDMI cables to find ones that worked properly with my setup and got 'enhanced 18 Gbps' mode to work.

What is more confusing: unless you are getting flagship TVs, most of them only aim for 90% of the P3 color space, which kinda makes UHD feel a bit more pointless.

I was reading one article that said that it may be 10 years before UHD is worthwhile as most consumers do not understand or care about color space and chroma subsampling.


QUOTE(dragontamer8740 @ Mar 13 2022, 21:26) *

So it's the secondary things that got improved on with the opportunity to break backward compatibility, rather than the resolution in and of itself.

So 4K is still unnecessary, but what you are identifying is a limitation of a particular (15+ year old) method of 1080p video encoding. A 4K rip downscaled with a good resampler will look just as great on a 50" 1080p set from a reasonable distance.


Is that if it's ripped to a modern container? Blu-rays themselves seem to have a host of issues.

QUOTE(dragontamer8740 @ Mar 13 2022, 21:26) *

Chromatic subsampling IS a problem that (even if it still happens at the same level) 4K would help with, but it would be far less wasteful to do 4:4:4 encodes in 1080p than it is to do 4:2:0 encodes in 4K for similar effective quality. The things that most hold back good color are bit depth and gamut, with bit depth being by far the most crucial thing that needs upgrading. Color accuracy would also be welcomed, but average people would have to realize their shit sucks before colorimeters ever go down in price.


Well, everything looks boring in my TV's calibrated mode, so I assume the enhanced color presentation is not necessarily accurate.



QUOTE(dragontamer8740 @ Mar 13 2022, 21:26) *

Oh yeah, nvidia won't give you 10 bit video unless you pay through the nose for it. Another reason to avoid their GPU's.


I was kind of confused by this. I thought Nvidia supported 10-bit video now? Are they still dithering or some BS?

I checked the settings on my comp; seems locked to 8-bit. wtf.



QUOTE(dragontamer8740 @ Mar 13 2022, 21:26) *

That's mostly right, but somewhat misleading.
It needs to be a higher end TV that not only can display the color, but also doesn't have a gamut so wide that calibrating/correcting it for 10-bit video will not cause banding like super wide gamut ones do, and also is being regularly reprofiled if it's an OLED which will suffer from white point and white level drift as the blue (and others) degrade.


I didn't go with an OLED because I was worried about the degradation.


QUOTE(dragontamer8740 @ Mar 13 2022, 21:26) *

Also, funnily enough, even cheaper CRT TV's still have a better color gamut/volume than most modern consumer sets have; largely on account of being analogue (and also the standard color CRT phosphors). If you go to "wide gamut" sets you can beat them there, but you won't beat them for volume.


Yes, it's quite interesting how good CRTs can look with modern inputs. My family had a giant CRT at one stage; that shit would be worth heaps now lol. It was too expensive to service though. It was also crazy heavy.

 
post Mar 14 2022, 17:21
Post #7272
Moonlight Rambler



Let's dance.
*********
Group: Gold Star Club
Posts: 6,487
Joined: 22-August 12
Level 373 (Dovahkiin)


QUOTE(EsotericSatire @ Mar 14 2022, 10:56) *
Yes, they are using Rec. 2020 for a wider color gamut, 10-bit color, and enhanced subsampling, and they support HDR10 and Dolby Vision. It's most likely artificial enhancement on the older content to boost the color space.

The thing I can't understand is why they say the standard supports full-bandwidth chroma subsampling but then they just use 4:2:0 anyway. It seems they leave it to the player / TV to convert back to 4:4:4 if it's supported.

Seems to be a limitation of HDMI 2.0.
HDMI 2.0 can carry 30-bit RGB (10 bits per channel), or YUV444. Just not progressive 4K YUV444 at 60 Hz.
Something like dual-link DVI at HDMI 2.0 data rates would solve this issue, but nooo.
That all being said, the correct solution would be to update the 1080p HDTV standard, call it "HDTV 2.0" or "XHDTV" or some marketing BS like that, and make it support 1080p 10-bit RGB/YUV444/Rec. 2020. Then you could use existing HDMI 2.0 data rates and get good colors.
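Back-of-the-envelope numbers for that, counting active pixels only (blanking intervals add more on top, so the real situation is even tighter):

```python
def video_gbps(width, height, fps, bits_per_px):
    """Approximate video data rate in Gbit/s, active pixels only."""
    return width * height * fps * bits_per_px / 1e9

# HDMI 2.0: 18 Gbps on the wire, 8b/10b TMDS coding -> ~14.4 Gbps of payload.
HDMI20_EFFECTIVE = 18.0 * 8 / 10

uhd_60_444_10bit = video_gbps(3840, 2160, 60, 30)  # ~14.93 Gbps: doesn't fit
hd_60_444_10bit  = video_gbps(1920, 1080, 60, 30)  # ~3.73 Gbps: fits easily
```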

The problem of course is that about twelve people would buy it, much like what happened with S-VHS. Most people buy based on resolution rather than actual quality. Like how they buy redder looking apples at the grocery even when (with things like honeycrisp) the best ones are yellower.

Also, one thing to keep in mind: anime produced between the beginning of digital coloring and 2012 (at the very earliest) will probably have been made targeting narrower color spaces like Rec. 709 to begin with. So if displayed uncorrected on a wide-gamut Rec. 2020 set, it will appear oversaturated. You really can't win as a consumer.

QUOTE(EsotericSatire @ Mar 14 2022, 10:56) *
Looking at another source, to get 4:4:4 chroma subsampling you need the whole chain to be 18 Gbps compatible: the UHD input (4:4:4 or raw RGB), the player, the cable, and the chip in the TV all correct. No wonder I had so much trouble going through HDMI cables to find ones that worked properly with my setup and got 'enhanced 18 Gbps' mode to work.
Yeah. This is why 4K is unnecessary. Adding (negotiated support for) Rec. 2020 on new 1080p HDTV's would be more than adequate. Overall image brightness would be higher, too.

QUOTE(EsotericSatire @ Mar 14 2022, 10:56) *
I was reading one article that said that it may be 10 years before UHD is worthwhile as most consumers do not understand or care about color space and chroma subsampling.
It's one of those things you have to do video work for a while to figure out, if you don't get a formal education in it.
QUOTE(EsotericSatire @ Mar 14 2022, 10:56) *
Is that if it's ripped to a modern container? Blu-rays themselves seem to have a host of issues.
Yes, this just means basically "a matroska file" or something of that nature with a good video codec that allows for high color depths and YUV444/RGB. Not one of the industry standard consumer media formats.
QUOTE(EsotericSatire @ Mar 14 2022, 10:56) *
Well everything looks boring in my TV's calibrated mode, so I assume the enhanced color presentation is not necessarily accurate.
Correct, although it also depends. What spec are you calibrating your TV's to?
If you calibrate a wide gamut screen to sRGB, it is of course going to look dull by comparison. If targeting rec.2020, however, it should look good. Make sure you are using a correction matrix, if you use displaycal/argyll CMS and your device is a colorimeter rather than a spectrophotometer.

QUOTE(EsotericSatire @ Mar 14 2022, 10:56) *
I was kind of confused by this. I thought Nvidia supported 10-bit video now? Are they still dithering or some BS?
I can tell you my card is 8-bit. If the 3090 is 10-bit (I don't know if it is), I don't care because I'm not going to spend that much money for two bits of depth when I could just buy any radeon on the market for the last few years or use Intel integrated graphics.
QUOTE(EsotericSatire @ Mar 14 2022, 10:56) *
I checked the settings on my comp; seems locked to 8-bit. wtf.
Yeah, that's what I thought.
Nvidia needs to be the first against the wall when the revolution comes. I hope more people are coming to realize this. But as you say, most people don't know or care about stuff like color depth or subsampling so nvidia can get away with segmenting the market artificially.
Also, the Windows drivers for Radeons and some high-end (but not top-of-the-line professional) Nvidia GPUs don't work properly with 10-bit framebuffers, whereas they will under modern Linux kernels.

Quadros apparently support 30-bit color (with the "studio drivers"), and so does the Radeon HD 5970 from 2010 (http://web.archive.org/web/20100208033952/https://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-5000/hd-5970/Pages/ati-radeon-hd-5970-specifications.aspx).

QUOTE(EsotericSatire @ Mar 14 2022, 10:56) *
I didn't go with an OLED because I was worried about the degradation.
It is sort of possible to deal with in software corrections, but you'll never get the full brightness back and you'll lose some bit depth correcting for it.
QUOTE(EsotericSatire @ Mar 14 2022, 10:56) *
Yes, it's quite interesting how good CRTs can look with modern inputs. My family had a giant CRT at one stage; that shit would be worth heaps now lol. It was too expensive to service though. It was also crazy heavy.
I have a 20" PVM-20L5 (not massive like 27-40" sets... but still) that uses SMPTE-C phosphors (not as bright as P-22 that most consumer sets use, but with better behaved primaries and a slightly better gamut) and does 720p. It really does look great in RGB/YPbPr.

Was yours a presentation monitor kind of deal (Mitsubishi made some of those, if I remember correctly)? Those things are often worth serious money.

This post has been edited by dragontamer8740: Mar 14 2022, 17:44

 
post Mar 15 2022, 01:31
Post #7273
EsotericSatire



Look, Fat.
***********
Group: Catgirl Camarilla
Posts: 12,740
Joined: 31-July 10
Level 500 (Ponyslayer)


QUOTE(dragontamer8740 @ Mar 14 2022, 05:21) *

I can tell you my card is 8-bit. If the 3090 is 10-bit (I don't know if it is), I don't care because I'm not going to spend that much money for two bits of depth when I could just buy any radeon on the market for the last few years or use Intel integrated graphics.
Yeah, that's what I thought.
Nvidia needs to be the first against the wall when the revolution comes. I hope more people are coming to realize this. But as you say, most people don't know or care about stuff like color depth or subsampling so nvidia can get away with segmenting the market artificially.
Also, the Windows drivers for Radeons and some high-end (but not top-of-the-line professional) Nvidia GPUs don't work properly with 10-bit framebuffers, whereas they will under modern Linux kernels.


The 10-bit support was BS. The studio driver is ancient, and it's only 10-bit in Photoshop and some other software that uses OpenGL. They didn't even update it for the 3000 series.


Did you see that Nvidia got hacked and the ransom demand is for better open-source Linux drivers lol.


 
post Mar 15 2022, 01:44
Post #7274
Moonlight Rambler



Let's dance.
*********
Group: Gold Star Club
Posts: 6,487
Joined: 22-August 12
Level 373 (Dovahkiin)


QUOTE(EsotericSatire @ Mar 14 2022, 23:31) *
The 10-bit support was BS. The studio driver is ancient, and it's only 10-bit in Photoshop and some other software that uses OpenGL. They didn't even update it for the 3000 series.
Did you see that Nvidia got hacked and the ransom demand is for better open-source Linux drivers lol.
No; I didn't.
That's excellent news, though, if true. They won't do anything, of course, but at least it makes them feel something.
>introduces firmware signing
>"oh don't worry, we'll sign firmware for you, this was totally just to stop people upgrading their artificially locked down cards/making counterfeits"
>doesn't sign firmware for them
>leaves the free software driver that people spent so much time and effort making through reverse engineering with zero help to rot

Looking into it, it may be possible that this group has firmware keys, or at least Verilog. It would be pretty great if that's true and they follow through with it; not going to lie.
But I bet it's mostly posturing.
If it isn't, then I might be keeping my GPU a while longer than anticipated.

Also, did you calibrate your TV yourself, or is this one of those "precalibrated" things? Because in my (rich) friend's experience with good hardware, precalibrated stuff is usually done really fast so it can get out the factory door, rather than with quality in mind. A really nice calibration takes a long time; and if it's precalibrated you don't even know what space they're targeting. Additionally, I kind of doubt they let it warm up a full half hour when calibrating in the factory.

Calibration and profiling on my laptop turns it from a dull red-tinted screen into something pleasant to look at. My plasma looks nice, and it's also calibrated (and used alongside a LUT on the PC I've connected to it).

This post has been edited by dragontamer8740: Mar 15 2022, 02:11

 
post Mar 15 2022, 14:09
Post #7275
EsotericSatire



Look, Fat.
***********
Group: Catgirl Camarilla
Posts: 12,740
Joined: 31-July 10
Level 500 (Ponyslayer)


QUOTE(dragontamer8740 @ Mar 14 2022, 13:44) *

Looking into it, it may be possible that this group has firmware keys, or at least verilog. It would be pretty great if that's true and they follow through with it; not going to lie.


Yeah, they just kept making it harder and harder. I had a laptop that could take newer graphics chips, but Nvidia kept making it hard to install the drivers. I think I got all the way up to the 680m; then it was too difficult to get the drivers to work, just because Nvidia said the board was not certified, even though it would otherwise work.




QUOTE(dragontamer8740 @ Mar 14 2022, 13:44) *

Also, did you calibrate your TV yourself, or is this one of those "precalibrated" things? Because in my (rich) friend's experience with good hardware, precalibrated stuff is usually done really fast so it can get out the factory door, rather than with quality in mind. A really nice calibration takes a long time; and if it's precalibrated you don't even know what space they're targeting. Additionally, I kind of doubt they let it warm up a full half hour when calibrating in the factory.

Calibration and profiling on my laptop turns it from a dull red-tinted screen into something pleasant to look at. My plasma looks nice, and it's also calibrated (and used alongside a LUT on the PC I've connected to it).



Just the factory-calibrated mode, checked by the wholesaler. Not sure how good it is. The color calibration hardware seems too expensive to justify.

 
post Mar 15 2022, 16:33
Post #7276
Moonlight Rambler



Let's dance.
*********
Group: Gold Star Club
Posts: 6,487
Joined: 22-August 12
Level 373 (Dovahkiin)


QUOTE(EsotericSatire @ Mar 15 2022, 12:09) *
Yeah, they just kept making it harder and harder. I had a laptop that could take newer graphics chips, but Nvidia kept making it hard to install the drivers. I think I got all the way up to the 680m; then it was too difficult to get the drivers to work.
Yeah, Kepler was the last generation that could work without signed firmware, not counting the 750 Ti and 750 Maxwells.

QUOTE(EsotericSatire @ Mar 15 2022, 12:09) *
Just because Nvidia said the board was not certified, even though it would otherwise work.
Just the factory-calibrated mode, checked by the wholesaler. Not sure how good it is.
The color calibration hardware seems too expensive to justify.
Unless your display was very cheap, yeah. It is quite overpriced in terms of the parts that make it up.

But my primary laptop's screen went from a red mess to something kind of nice to look at. My spare laptop's overly blue mess became much more pleasant as well. It also fixed the gamma curve on my PowerBook running Linux (since old Macs used a 1.8 gamma function).
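The 1.8-vs-2.2 mismatch is easy to put numbers on. Assuming idealized pure power curves (real sRGB has a piecewise linear segment near black), a midtone mastered for a 1.8-gamma Mac lands noticeably darker on an uncorrected 2.2 display:

```python
def emitted(signal, display_gamma):
    """Linear light a pure power-law display emits for a signal in [0, 1]."""
    return signal ** display_gamma

intended = emitted(0.5, 1.8)  # ~0.287: what the 1.8-gamma Mac screen showed
actual   = emitted(0.5, 2.2)  # ~0.218: same pixel on an uncorrected 2.2 display
# Midtones come out roughly a quarter darker, which is what the LUT corrects.
```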

Why you might want to try one out:

If you can find someone to borrow a colorimeter from, I highly recommend trying one out with DisplayCAL or similar (the manufacturer software is usually crap except for the most expensive bundles).

If you want a recommendation, try to get a used Colormunki Display. That's the colorimeter I have; it's basically the same as the i1 Display Pro except it's been slightly crippled in firmware for a couple things you probably won't need to worry about. Basically it's a little slower than the i1 Display Pro. For 'refresh' screens (emissive ones, rather than transmissive like LCD's) it will be slightly less accurate unless you make it do more patches when profiling on account of not being able to synchronize to the redraw of the display. The software they bundle is also inferior, but since it can be used with Argyll CMS/DisplayCAL that is a non-issue. So all it boils down to is taking longer to calibrate. Also it's limited to a mere 1000 nits instead of the 2000 of the $320 version.

Colors actually match (or as closely as possible, if out of gamut) across my devices' screens, now. The worse the display, the bigger of a difference profiling/calibration will make. For better ones the results aren't as noticeable, unless you are targeting a non-native gamut on a wide gamut screen. The thing it's best for is fixing gamma ramps and incorrect white points in those cases.

They rebranded that device recently; it's now sold as the "ColorChecker Display" by a spinoff of X-Rite.

Here's a good write-up on argyllcms.com about the differences between versions of the device, by the Argyll CMS author. They're cheap enough used that they might be a decent way to get your feet wet.

This post has been edited by dragontamer8740: Mar 15 2022, 22:40

 
post Mar 16 2022, 06:44
Post #7277
Necromusume



ΣΚΙΒΙΔΙ ΣΚΙΒΙΔΙ
*********
Group: Catgirl Camarilla
Posts: 7,162
Joined: 17-May 12
Level 500 (Ponyslayer)


Web designers who use sliders & carousels should star in a youtube video featuring a torture test of a machete.

 
post Mar 16 2022, 20:58
Post #7278
cate_chan



Technekololigy Enthusiast
****
Group: Members
Posts: 406
Joined: 4-May 18
Level 114 (Ascended)


QUOTE(Necromusume @ Mar 16 2022, 06:44) *

Web designers who use sliders & carousels should star in a youtube video featuring a torture test of a machete.
in general the phone-era-spawned design is misery to no end. why do I have slide-out sidebars with huge buttons in a web browser? why is everything rounded? where are the buttons?
it's all so tiresome

speaking of misery, PoE is far more complicated than I initially thought. I always assumed it was just DC over the spare pair, but there's a couple of standards and even some handshaking/class identification. that is, if you want to support all PoE switches with a device for the 'it just works' experience.
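For anyone else surprised by this: in 802.3af/at, the powered device presents a signature current during classification and the switch maps it to a power budget. A sketch of that mapping (the class currents and power limits are from the standards; the helper itself is just illustrative):

```python
# IEEE 802.3af/at classification: signature current range (mA) ->
# max power available at the powered device (W).
POE_CLASSES = {
    0: ((0, 4),   12.95),  # default / unclassified (802.3af)
    1: ((9, 12),   3.84),
    2: ((17, 20),  6.49),
    3: ((26, 30), 12.95),
    4: ((36, 44), 25.50),  # requires an 802.3at (PoE+) switch
}

def classify(current_ma):
    """Map a measured classification current to a PoE class (None if invalid)."""
    for cls, ((lo, hi), _power) in POE_CLASSES.items():
        if lo <= current_ma <= hi:
            return cls
    return None
```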

This post has been edited by cate_chan: Mar 16 2022, 20:59

 
post Mar 16 2022, 23:35
Post #7279
Moonlight Rambler



Let's dance.
*********
Group: Gold Star Club
Posts: 6,487
Joined: 22-August 12
Level 373 (Dovahkiin)


QUOTE(Necromusume @ Mar 16 2022, 04:44) *
Web designers who use sliders & carousels should star in a youtube video featuring a torture test of a machete.
(image: https://imgs.xkcd.com/comics/slideshow.gif)
I will extend this to anyone who uses WebComponents or requires javascript for basic functionality of their site (like "to read text").

My website (which I will not link to) renders everything properly in Netsurf, which is the browser I think everyone should really be targeting to determine if their ideas are good or not.

 
post Mar 19 2022, 01:50
Post #7280
Moonlight Rambler



Let's dance.
*********
Group: Gold Star Club
Posts: 6,487
Joined: 22-August 12
Level 373 (Dovahkiin)


Anyone else unable to log in to pixiv with an existing account? Like, there's nowhere to do it; just "sign in with <external cancer service>."

Attached Image

Edit:
turning on:
CODE
dom.webcomponents.enabled
dom.webcomponents.customelements.enabled
…fixed it.

Every time a new site uses WebComponents, I hope a little harder that the house of whatever Google engineer proposed it burns down (though without the family inside, because I'm not a complete monster).

This post has been edited by dragontamer8740: Mar 19 2022, 01:55



 

