What is the last thing you thought?, Tech Edition |
|
Oct 8 2021, 13:45
|
EsotericSatire
Group: Catgirl Camarilla
Posts: 12,763
Joined: 31-July 10

|
Thanks for the offer but I probably don't need one, I have three mechanical and X number of membrane keyboards around.
Hmmm I haven't really had an issue with cherry mx switches breaking o.O. I imagine if you were taking the keycaps on and off a lot that they might.
I type decently fast so if I use keycaps that are too loud it makes a fair bit of noise.
In high school they had an exam where we had to get to at least 60 WPM and ideally 100 WPM with 95% accuracy. These days, however, apart from people doing audio transcription, people don't seem to type as fast.
My typing speed now is probably only 60-70 WPM and people think that's fast.
|
|
|
|
 |
|
Oct 8 2021, 16:51
|
Wayward_Vagabond
Group: Gold Star Club
Posts: 6,305
Joined: 22-March 09

|
Ended up going with a Supermicro X10SLH-F board, a Xeon E3-1271 v3 CPU, a copper-slug Intel LGA115x OEM cooler, 32GB of 1600MHz ECC RAM, and the RX580 I already had. I have a SATA MX500, can't remember the size, that'll work fine for a boot drive. The PSU is still up in the air, but I'm aiming for a 600W or so platinum SFX. I have a full-size 645W Supermicro supply I can use for testing anyhow.
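A rough power budget for that parts list, as a sanity check on the 600W target (a back-of-envelope sketch; the wattages are nominal TDP/board-power figures, not measurements):

```python
# Back-of-envelope PSU sizing for the build above.
# TDP/board-power numbers are nominal figures, not measurements.
parts = {
    "Xeon E3-1271 v3 (80 W TDP)": 80,
    "RX 580 (~185 W typical board power)": 185,
    "Board, 32GB ECC, SATA SSD, fans (rough allowance)": 50,
}
total = sum(parts.values())
headroom = 1.5  # margin for transients and the PSU's efficiency sweet spot
print(f"Estimated sustained draw: ~{total} W")
print(f"With ~50% headroom:      ~{total * headroom:.0f} W")
```

That lands around 470W with headroom, so a 600W platinum SFX leaves a comfortable margin.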
Diagnosis on the small computer: the board has a ghost in it. New RAM and a new PSU didn't help, but I noticed it was tight on the standoffs when I pulled it. A different board wouldn't just drop onto the standoffs; I had to scoot one towards the edge of the board a bit. It's not implausible that a year-plus of on time (minus the 10 minutes it took to move it ~250 days ago) with that strain on the board could have damaged it. RIP.
|
|
|
|
 |
|
Oct 8 2021, 17:07
|
Wayward_Vagabond
Group: Gold Star Club
Posts: 6,305
Joined: 22-March 09

|
Dubs. Still trying to figure out if sound will be an old Xonar/Creative card or a USB widget. Wifi will likely be a dongle I have, shoved into the weird on-motherboard USB 3 port. At least the board has a standalone graphics chip for diagnostics, a serial port, a header for another serial, and plenty of 4-pin fan headers.
|
|
|
|
 |
|
Oct 8 2021, 17:24
|
Moonlight Rambler
Group: Gold Star Club
Posts: 6,498
Joined: 22-August 12

|
QUOTE(EsotericSatire @ Oct 8 2021, 07:45)
Thanks for the offer but I probably don't need one, I have three mechanical and X number of membrane keyboards around.

Fun fact, Model M's are technically membrane keyboards.

QUOTE(EsotericSatire @ Oct 8 2021, 07:45)
Hmmm I haven't really had an issue with cherry mx switches breaking o.O. I imagine if you were taking the keycaps on and off a lot that they might.

It's not the key switches, but the keycaps, where they couple with the stems. I think the real issue is quality control; I've not heard much about first-party Cherry caps breaking.

QUOTE(EsotericSatire @ Oct 8 2021, 07:45)
I type decently fast so if I use keycaps that are too loud it makes a fair bit of noise.

Yeah, I learned to stop worrying and embrace the noise. If I want less noise I pull out the Apple board or a rubber-dome Sun Type 5 board.

QUOTE(EsotericSatire @ Oct 8 2021, 07:45)
In high school they had an exam where we had to get to at least 60 WPM and ideally 100 WPM with 95% accuracy. These days, however, apart from people doing audio transcription, people don't seem to type as fast.

I get around 70-80 I think, last I checked.

QUOTE(EsotericSatire @ Oct 8 2021, 07:45)
My typing speed now is probably only 60-70 WPM and people think that's fast.

Relatively speaking, it is.

QUOTE(Wayward_Vagabond @ Oct 8 2021, 11:07)
Still trying to figure out if sound will be an old Xonar/Creative card or a USB widget.

I use an Audigy 2 in mine; depending on your OS, old sound cards can be a fine option.

This post has been edited by dragontamer8740: Oct 8 2021, 17:26
|
|
|
|
 |
|
Oct 8 2021, 18:29
|
Wayward_Vagabond
Group: Gold Star Club
Posts: 6,305
Joined: 22-March 09

|
Xubuntu 20.04 LTS. Actually, the PSU already in the case should run the mobo, just not the RX580.
Why don't VGA to DisplayPort adapters/dongles seem to exist? Nothing on Newegg, nothing on eBay (save for typos). There's a $36 6-foot active cable and a $60 box on Amazon. Oi.
This post has been edited by Wayward_Vagabond: Oct 8 2021, 21:05
|
|
|
|
 |
|
Oct 8 2021, 23:55
|
cate_chan
Group: Members
Posts: 406
Joined: 4-May 18

|
so I originally bought a server to rid myself of the pains of using consumer hardware for server tasks (the machine froze before because of one bad drive). I somehow assumed 'enterprise' server hardware wouldn't have these kinds of issues, but it couldn't be further from my expectations. just getting the shitty server I got a while ago set up, I've run into:
- every linux distro kernel panicking because the monitor resolution crashed something
- every linux distro kernel panicking because the raid controller was using out of date firmware
- inability to mix sas and sata drives within one raid, ruining my plan to throw in any drives I had laying around
- the raid controller not recognizing certain brands/types of sas drives at random
- the raid controller generally being a pain in the dick to deal with online, like claiming the array 'is busy transforming' forcing me to reboot and configure it from the option rom
- the entire system crashing for unknown reasons
- the entire system not booting because ram is in the wrong slots
maybe when I finally have it all set up and running I'll be more positive about the 'enterprise' experience. on the plus side, I'm acquiring a lot of nice hardware in the process of sinking money into this stupid system; if anything breaks I'll have replacements in the future.
This post has been edited by cate_chan: Oct 8 2021, 23:56
|
|
|
|
 |
|
Oct 9 2021, 00:01
|
Moonlight Rambler
Group: Gold Star Club
Posts: 6,498
Joined: 22-August 12

|
QUOTE(cate_chan @ Oct 8 2021, 17:55)
so I originally bought a server to rid myself of the pains of using consumer hardware for server tasks (the machine froze before because of one bad drive). I somehow assumed 'enterprise' server hardware wouldn't have these kinds of issues, but it couldn't be further from my expectations.

Yeah, I was telling Wayward basically this on IRC a day or two ago. They're just regular PCs, but the parts are more expensive and there's usually way more stuff to go wrong/interfere on the buses.

QUOTE(cate_chan @ Oct 8 2021, 17:55)
just getting the shitty server I got a while ago set up, I've run into:
- every linux distro kernel panicking because the monitor resolution crashed something
- every linux distro kernel panicking because the raid controller was using out of date firmware
- inability to mix sas and sata drives within one raid, ruining my plan to throw in any drives I had laying around
- the raid controller not recognizing certain brands/types of sas drives at random
- the raid controller generally being a pain in the dick to deal with online, like claiming the array 'is busy transforming' forcing me to reboot and configure it from the option rom
- the entire system crashing for unknown reasons
- the entire system not booting because ram is in the wrong slots
maybe when I finally have it all set up and running I'll be more positive about the 'enterprise' experience

Nah, enterprise equipment is based on the same basic architecture as consumer gear. Redundant PSU's are nice, but they still use the same execution cores and such. They just tend to have nice things like SCSI/SAS controllers built in.

Also, my advice is to never use hardware RAID if you can avoid it. Software RAID has relatively little overhead and is a lot easier to recover than if your RAID controller fails. You may also want to try ZFS, which is actually usually more performant in RAID when not using a hardware RAID controller. ([ en.wikipedia.org] ZFS: Wikipedia)

I had to boot a "bootstrap" bootloader off a floppy disk to get Debian on my old IBM server because it refused to boot from USB. If anything, enterprise gear is more user-hostile, in that little to no effort has been made to make things plug-n-play and the operator is expected to know the manual inside and out.

BTW, I'm not saying "don't get enterprise gear," I'm just saying "most of your problems won't be solved simply by virtue of using enterprise gear." It's not really made to that much higher of a standard than (reasonably good quality) consumer equipment, with the exception of stuff like redundant PSU's; it's just backed by a support contract so the operator is off the hook when hardware fails, and it looks less ugly than RGB LED consumer gear. I guess things like HDD caddies and storage backplanes are convenient, too, but I wouldn't spend the extra money just for that unless I'm swapping HDD's like game console cartridges.

It also might not be made to meet the same FCC regulations; specifically, it might be made to meet FCC Part 15 Class A instead of Class B like most consumer gear. My IBM terminal is an (ancient) example of this. That basically means it might interfere more with radio stuff than consumer gear would.

This post has been edited by dragontamer8740: Oct 9 2021, 00:17
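A note on the software-RAID point above: one reason recovery and monitoring are simpler is that the array state is plain text the OS exposes directly. A minimal sketch, assuming a Linux host with mdadm arrays (device names and parsing are illustrative):

```python
# Minimal sketch: flag degraded mdadm arrays by reading /proc/mdstat.
# Assumes a Linux host with md software RAID; purely illustrative.
import re

def degraded_arrays(path="/proc/mdstat"):
    """Return names of md arrays whose member status shows a missing disk."""
    bad, current = [], None
    with open(path) as f:
        for line in f:
            m = re.match(r"^(md\d+)\s*:", line)
            if m:
                current = m.group(1)
                continue
            # Status lines end like "... [2/2] [UU]";
            # an underscore in the bracket means a failed/missing member.
            m = re.search(r"\[([U_]+)\]\s*$", line)
            if current and m and "_" in m.group(1):
                bad.append(current)
    return bad

if __name__ == "__main__":
    broken = degraded_arrays()
    print(broken if broken else "all md arrays look healthy")
```

The same idea applies to ZFS, where "zpool status -x" reports only the pools that are unhealthy.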
|
|
|
|
 |
|
Oct 9 2021, 00:59
|
cate_chan
Group: Members
Posts: 406
Joined: 4-May 18

|
QUOTE(dragontamer8740 @ Oct 9 2021, 00:01)
Also, my advice is to never use hardware RAID if you can avoid it. Software RAID has relatively little overhead and is a lot easier to recover than if your RAID controller fails. Also, you may want to try ZFS, which is actually usually more performant in RAID when not using a hardware RAID controller. ([ en.wikipedia.org] ZFS: Wikipedia)

I do prefer hardware raid based on being able to put everything, including the os, on the redundant storage. I have software raid on my current janky setup with mdadm but it's a bit of a nightmare to do it for the boot/os drive, so the os has its own dedicated ssd, which also died in the heat this year. with hardware raid it's all nicely usable in the os, and it's decently manageable from the os with interface tools like ssacli (when it's cooperating with the drives I put in). in terms of performance, async nfs always saturates my lan speeds regardless of how terrible the storage is, so it's kind of a non-point for me.

QUOTE(dragontamer8740 @ Oct 9 2021, 00:01)
I had to boot a "bootstrap" bootloader off a floppy disk to get Debian on my old IBM server because it refused to boot from USB.

I have a 'plop boot manager' cd somewhere that allows terrible hardware to boot from usb through that, many oddball cases solved with that.

QUOTE(dragontamer8740 @ Oct 9 2021, 00:01)
If anything, enterprise gear is more user-hostile, in that little to no effort has been made to make things plug-n-play and the operator is expected to know the manual inside and out.
BTW, I'm not saying "don't get enterprise gear," I'm just saying "most of your problems won't be solved simply by virtue of using enterprise gear." It's not really made to that much higher of a standard than (reasonably good quality) consumer equipment, with the exception of stuff like redundant PSU's; it's just backed by a support contract so the operator is off the hook when hardware fails, and it looks less ugly than RGB LED consumer gear.
I guess things like HDD caddies and storage backplanes are convenient, too, but I wouldn't spend the extra money just for that unless I'm swapping HDD's like game console cartridges.

I just liked the idea of dasblinkenlights hdd arrays, and when it's all working it seems very nice to have. the issue is just the initial setup of it all; when it's finally running it'll be nicer to keep running, probably. also probably doesn't help this is a reasonably old system I bought entirely untested, without much prior knowledge. in 20/20 hindsight, should've probably gotten some dell hardware instead of hp. but yeah, the whole 'enterprise' thing is mostly contracts and the price tag. when you're dealing with it from the point of second-hand hardware and doing it all yourself, there's not a whole lot it has in terms of stability over consumer hardware. it does scale a lot nicer though, with things like backplanes and raid controllers, and the rack format as a whole.

This post has been edited by cate_chan: Oct 9 2021, 01:06
|
|
|
|
 |
|
Oct 9 2021, 02:52
|
Moonlight Rambler
Group: Gold Star Club
Posts: 6,498
Joined: 22-August 12

|
I wonder if the temperature shut-off function works the same in the electric kettles for green tea (80°C or so) as in the ones that just boil water. Apparently, the boiling-water ones have something to do with steam pressure. I have a boiler kettle, but my favourite tea is genmaicha, so I'm getting annoyed trying to figure out a method to reliably get a good brew (rough mixing math below). No, I do not have a thermometer. And no, I will not buy a thermometer right now.

QUOTE(cate_chan @ Oct 8 2021, 18:59)
I do prefer hardware raid based on being able to put everything, including the os, on the redundant storage. I have software raid on my current janky setup with mdadm but it's a bit of a nightmare to do it for the boot/os drive, so the os has its own dedicated ssd, which also died in the heat this year. with hardware raid it's all nicely usable in the os, and it's decently manageable from the os with interface tools like ssacli (when it's cooperating with the drives I put in).

Looks like you can boot a ZFS RAID root. Also, I don't use SSD's and as a result haven't had a disk die in about a decade. Most of my disks are at around 50-60k hours (5-7 years) of use, and most were manufactured prior to 2010. Not trying to say you shouldn't use SSD's; just saying I've gotten along fine without them, and I have seen a lot of them fail for others while my disks seem to keep swimming. Survivorship bias, yeah, but I don't trust SSD's for anything I would be upset about losing tomorrow. I do have one SSD, which was given to me used for free, which is in my machine on an evaluation basis and which I might use when compiling firefox or something.

QUOTE(cate_chan @ Oct 8 2021, 18:59)
I have a 'plop boot manager' cd somewhere that allows terrible hardware to boot from usb through that, many oddball cases solved with that.

I have one, too, but it was useless for installing Debian on my 2006 iMac because even though the installer ran, it tried to install a 64-bit UEFI image instead of using CSM, and therefore the installed OS did not boot. Had to master a Debian DVD with the UEFI boot binaries ripped out. Plop's a cool tool, but it's a shame it's closed source. I think I still have a CD-R with it somewhere. The floppy I made had GRUB on it, and I used it to get the installer to boot off a flash drive.

QUOTE(cate_chan @ Oct 8 2021, 18:59)
also probably doesn't help this is a reasonably old system I bought entirely untested, without much prior knowledge.

That's how I learned on the IBM I had. I think that's the only way anyone learns anything worth knowing in tech, tbh.

QUOTE(cate_chan @ Oct 8 2021, 18:59)
in 20/20 hindsight, should've probably gotten some dell hardware instead of hp.

Definitely. Although I have a bone to pick with current-year Dell (for example, that pathetic machine they're daring to call a "Latitude" laptop now), they made good stuff for a long time and weren't fucking stupid about it like HP always is. The IBM I had was actually pretty good about that, too, by comparison.

Some perverse, twisted part of me really wants an HP-UX box, though. Either 68000 or Itanium-based. Maybe even PA-RISC, but I think it would be more horrible to install and run Debian on Itanic or m68k HP-UX boxes - and thus more prestigious. PA-RISC still has a Debian port, btw. But it just doesn't interest me like m68k and Itanic do. I'd like m68k for being a fabulous, classic CISC design - and from a period when I had a modicum of respect for HP's products.
I'd like Itanic for the same kinds of reasons that people love terrible movies. It's an absolute wreck, from what I can tell. And PA-RISC wanted to be DEC Alpha, but just never could be. Oh yeah, also HP apparently put HP-UX (or at least some of it) in ROM in its early workstations. Lol. I don't think I'll ever be able to afford a secondhand Itanium machine, now or in the future. They were just too expensive when new and they're way too niche. And even if I could, I'd rather buy a POWER, Alpha, or SPARC-based machine with that money.

P.S.: not all enterprise gear is in rack format, unfortunately. For me, the single coolest thing about enterprise hardware is that not all of it is x86, and some of it is even still somewhat performant. So that's where my mind goes when I think about buying any enterprise gear.

This post has been edited by dragontamer8740: Oct 9 2021, 03:14
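On the kettle question above, one thermometer-free approach is mixing just-boiled water with cool tap water. A rough sketch, assuming ~20°C tap water and ignoring heat soaked up by the pot and cup:

```python
# Rough mixing estimate for hitting a target brew temperature without a thermometer.
# Assumes both waters have the same heat capacity and ignores heat lost to the pot/cup.
def boiling_fraction(target_c, boiling_c=100.0, tap_c=20.0):
    """Fraction of the final volume that should be just-boiled water."""
    return (target_c - tap_c) / (boiling_c - tap_c)

for target in (70, 80, 90):
    f = boiling_fraction(target)
    print(f"{target}°C: ~{f:.0%} boiling water, ~{1 - f:.0%} tap water")
# 80°C works out to roughly 3 parts boiling water to 1 part tap water.
```

In practice the pot steals some heat, so erring slightly hotter than the calculation suggests is reasonable.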
|
|
|
|
 |
|
Oct 9 2021, 11:10
|
EsotericSatire
Group: Catgirl Camarilla
Posts: 12,763
Joined: 31-July 10

|
QUOTE(dragontamer8740 @ Oct 8 2021, 05:24)
Fun fact, Model M's are technically membrane keyboards. It's not the key switches, but the keycaps, where they couple with the stems. I think the real issue is quality control; I've not heard much about first-party Cherry caps breaking.

Oh interesting. I think most of my keyboards used cherry mx switches with higher quality keycaps.

QUOTE(dragontamer8740 @ Oct 8 2021, 05:24)
Yeah, I learned to stop worrying and embrace the noise. If I want less noise I pull out the Apple board or a rubber-dome Sun Type 5 board. I get around 70-80 I think, last I checked.

I just did a test and hit 80 WPM pretty easily. *scratches head* Maybe I've been ranting online too much ROFL. (This is possible.) Normally I'd get back up to 100-110 WPM if I was doing an audio transcription job. Then it drops back down. I wonder if the keyboard makes an impact, like membrane vs mechanical?

QUOTE(dragontamer8740 @ Oct 8 2021, 05:24)
Relatively speaking, it is. I use an Audigy 2 in mine; depending on your OS, old sound cards can be a fine option.

I've got an Asus STX Essence with the upgraded op-amps. Haven't used it for a while as I have a better external DAC.
|
|
|
|
 |
|
Oct 9 2021, 14:17
|
Wayward_Vagabond
Group: Gold Star Club
Posts: 6,305
Joined: 22-March 09

|
Got my PC case yesterday; motherboard and cooler are projected for today, RAM Tuesday, and the CPU is ? as it hasn't even shipped. Still waiting to see if I win the sound card or PSU. The Xeon sticker shipped but has no tracking. I'll probably gank the LG Blu-ray burner out of the OptiPlex for this box and stick a DVD+/-RW drive in there. I quite like the case seeing it in person: pretty simple and clean, looks like it could be enterprise.
It looks like the cooler comes pre-pasted with a grey compound. The box it was shown with belonged to a Xeon v6 chip. Would it be advisable to repaste it? Leaning towards yes, and also leaning towards using something a bit better than generic white thermal grease.
As for the VGA to DP, I already have an HDMI to DP dongle, and VGA to HDMI ones are cheap. It just feels dumb chaining two active dongles in a row. I suppose latency isn't really a concern for anything I'd be using VGA for, unless it's so bad the audio leads the video.
This post has been edited by Wayward_Vagabond: Oct 9 2021, 14:22
|
|
|
|
 |
|
Oct 9 2021, 22:14
|
uareader
Group: Catgirl Camarilla
Posts: 5,594
Joined: 1-September 14

|
My air cooler can also produce heat, and so far it seems to be better than the normal heater: it heats the room faster and doesn't let the cold reconquer as easily. But we're just at the beginning of the cold temperatures. Will have to see if it keeps up.
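For context on why a reverse-cycle "air cooler" can out-heat a plain electric heater: it moves heat indoors rather than generating it, so each watt of electricity delivers more than a watt of heat. A minimal sketch with an assumed, typical COP (not a measurement of this particular unit):

```python
# Rough comparison of resistive heating vs. a heat pump (air conditioner in heating mode).
# A COP (coefficient of performance) of ~3 is a typical assumed value, not a measurement.
electrical_input_w = 1000                       # 1 kW drawn from the wall
resistive_heat_w = electrical_input_w * 1.0     # resistive heaters: ~1 W of heat per W in
heat_pump_cop = 3.0
heat_pump_heat_w = electrical_input_w * heat_pump_cop   # heat moved indoors

print(f"Resistive heater:     ~{resistive_heat_w:.0f} W of heat")
print(f"Heat pump (COP {heat_pump_cop:.0f}):   ~{heat_pump_heat_w:.0f} W of heat")
```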
|
|
|
|
 |
|
Oct 10 2021, 00:23
|
Moonlight Rambler
Group: Gold Star Club
Posts: 6,498
Joined: 22-August 12

|
QUOTE(Wayward_Vagabond @ Oct 9 2021, 08:17)
It looks like the cooler comes pre-pasted with a grey compound. The box it was shown with belonged to a Xeon v6 chip. Would it be advisable to repaste it? Leaning towards yes, and also leaning towards using something a bit better than generic white thermal grease.

Meh, it's probably fine. I reused the same gray clay-like compound on my CPU through multiple re-seatings of the heat sink. But it depends on the power consumption/heat of the CPU. If it's fresh thermal compound (meaning unused) it's probably fine. I only switched thermal compounds when I replaced the CPU entirely while keeping the original heat sink (a copper-cored Intel one).

This post has been edited by dragontamer8740: Oct 10 2021, 00:24
|
|
|
|
 |
|
Oct 10 2021, 03:50
|
Wayward_Vagabond
Group: Gold Star Club
Posts: 6,305
Joined: 22-March 09

|
Joke's on me, they sent me a used cooler with no compound on it instead. And this bit isn't their fault: the stock cooler seems to use some bizarre rivet-type mounting setup, but the backplate on the mobo is threaded posts. The current backplate is also stuck on there really well, to the point where I'm afraid to apply more force to try and remove it.
Somebody on IRC suggested the Scythe Shuriken 2 as a cooler, and that looks perfect. It accepts normal 92mm fans, and its mounting scheme lets me use any backplate I want: there's a base that sits above the board and uses hardware through standoffs to attach to a backplate. Springs are on the CPU side of the bracket. Probably better thermal performance than the copper slug cooler.
The board came with a Dynatron 2U cooler equipped with a 60x25mm 0.45A fan. It's noisy.
This post has been edited by Wayward_Vagabond: Oct 10 2021, 07:37
|
|
|
|
 |
|
Oct 10 2021, 23:06
|
Moonlight Rambler
Group: Gold Star Club
Posts: 6,498
Joined: 22-August 12

|
Rewired the entirety of my computer today so that I'd be able to sit a couple of 2.5" drives in free spaces in the case to make backups of my laptop HDD's.
It's great that one of my disks (an old Hitachi Deskstar) was old enough to have both Molex and SATA power connectors on it. Meant I could make a Molex useful without getting one of those scary adapters.
On account of this, I currently have nine disks attached to my desktop.
|
|
|
|
 |
|
Oct 11 2021, 04:44
|
EsotericSatire
Group: Catgirl Camarilla
Posts: 12,763
Joined: 31-July 10

|
QUOTE(dragontamer8740 @ Oct 10 2021, 11:06)  Rewired the entirety of my computer today so that i'd be able to sit a couple 2.5" drives in free spaces in the case to make backups of my laptop HDD's.
It's great that one of my disks (an old hitachi deskstar) was old enough to have both Molex and Sata power connectors on it. Meant I could make a Molex useful without getting one of those scary adapters.
On account of this, I currently have nine disks attached to my desktop.
Hopefully that's not the 'death star' model.
That's a lot of attached drives; is the PSU okay? I think the only time I had issues was when I had the giant mining rig with 10 drives and four graphics cards. I had failed to account for the rail load of the HDDs. The giant case did have space for two more PSUs, though.
edit: Must be something wrong with Google's algorithm, they keep showing me ads for gay dating services. Also my work keeps asking me to fill out a survey on the workplace experience of LGBTQQ++ people. Weirdness.
This post has been edited by EsotericSatire: Oct 11 2021, 06:12
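On the rail-load point above, a back-of-envelope sketch of the 12V transient from ten drives spinning up (the per-drive current is a typical assumed figure, not a spec for any particular model):

```python
# Back-of-envelope 12 V rail load from HDD spin-up in a 10-drive rig.
# The per-drive current is a typical assumed value, not a datasheet number.
hdd_count = 10
spinup_amps_per_drive = 2.0   # 3.5" drives often pull ~1.5-2.5 A at 12 V while spinning up

total_amps = hdd_count * spinup_amps_per_drive
print(f"Spin-up transient: ~{total_amps:.0f} A at 12 V (~{total_amps * 12:.0f} W)")
# ~240 W of transient load before the CPU and GPUs draw anything,
# which is easy to overlook when sizing a PSU around steady-state numbers.
```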
|
|
|
|
 |
|
Oct 11 2021, 06:25
|
Moonlight Rambler
Group: Gold Star Club
Posts: 6,498
Joined: 22-August 12

|
QUOTE(EsotericSatire @ Oct 10 2021, 22:44)
Hopefully that's not the 'death star' model.

They're Hitachi era, so newer than the death stars. June 2007 and November 2008.

QUOTE(EsotericSatire @ Oct 10 2021, 22:44)
That's a lot of attached drives; is the PSU okay? I think the only time I had issues was when I had the giant mining rig with 10 drives and four graphics cards. I had failed to account for the rail load of the HDDs. The giant case did have space for two more PSUs, though.

It's a dinky little 430W PSU of the old variety with the 115/220V mains switch, but it seems to be doing fine. I'm using a 750 Ti, which only needs power from the PCIe bus, and an Ivy Bridge CPU with a ~70W TDP. Also, two of the drives are 2.5" HDD's that will be put back in my laptop as soon as the backup finishes, and one is a 2.5" SSD that I've just slid under the lowest HDD. It's just amusing to see /dev/sdi mounted.

The "appointed for life" drives are two old Deskstars (320GB June 2007, 500GB Nov. 2008), an old 500GB WD Green from May 2008 with the idle3 timer turned off (though the head parked a million times first; see the SMART check sketch below), my boot disk (a Seagate 500GB from March 2008), a 3TB WD Black from January 2015, and the new 6TB WD Red Plus (July 2021). The 480GB SSD is there for compiling huge programs and stuff; I trust every single HDD in my desktop (even the 13-year-old ones) more than I trust it. Thinking if I move everything to the two newer drives, I might try a RAID with the three 500GB ones.

thought: I wish modern consumer HDD models had comprehensive publicly available documentation like this SCSI Barracuda from 1996 that I have got: https://www.seagate.com/support/disc/manuals/scsi/28880d.pdf Of course, if they did that, people would realize how much market segmentation is being employed. Or wouldn't buy the expensive SAS disks. I kind of doubt people would actually care; WD already lies about SMR disks being acceptable in a NAS and they're still selling.

Yes, I know my GPU is in an x4 PCIe 2.0 slot. I haven't noticed any actual performance impact vs. the x16 PCIe 3.0 slot, so I don't care. Doing this frees up a conventional PCI slot that the GPU would otherwise obscure. Also I can't find two of my HDD mounting rails, which is why there's a free spot there.

(IMG: https://i.imgur.com/1qjcvYTg.jpg)
(IMG: https://i.imgur.com/dRU5lNDg.jpg)
(IMG: https://i.imgur.com/KX5M04Pg.jpg)

This post has been edited by dragontamer8740: Oct 11 2021, 07:21
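On the idle3/load-cycle note above, a minimal sketch for checking a drive's Load_Cycle_Count by parsing smartctl -A output. It assumes smartmontools is installed and usually needs root; /dev/sdc is only an example device path:

```python
# Minimal sketch: read a drive's SMART Load_Cycle_Count via smartctl.
# Assumes smartmontools is installed and the drive reports the attribute;
# usually needs to run as root. /dev/sdc is just an example device.
import subprocess

def load_cycle_count(device="/dev/sdc"):
    out = subprocess.run(
        ["smartctl", "-A", device],
        capture_output=True, text=True, check=False,
    ).stdout
    for line in out.splitlines():
        # Attribute rows look like:
        # "193 Load_Cycle_Count 0x0032 001 001 000 Old_age Always - 1015341"
        if "Load_Cycle_Count" in line:
            return int(line.split()[-1])
    return None  # attribute not reported

if __name__ == "__main__":
    print("Load_Cycle_Count:", load_cycle_count())
```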
|
|
|
|
 |
|
Oct 11 2021, 13:43
|
Wayward_Vagabond
Group: Gold Star Club
Posts: 6,305
Joined: 22-March 09

|
I need to stop being lazy and do the PSU swap in my big computer. Hesitant to have it torn apart until I get my Linux box up, though. I won the PSU and sound card (didn't even get sniped), but the CPU hasn't shipped yet. I suppose I can't have it fully assembled without a PSU anyway. I have a Scythe cooler on the way. Wonder if it's worth upgrading the 60mm exhaust fans on the case. The case has front and top vents, so it'll basically be negative pressure, with the PSU, GPU, and rear case fans shoving hot air out the back.
This post has been edited by Wayward_Vagabond: Oct 11 2021, 14:10
|
|
|
|
 |
|
Oct 11 2021, 21:48
|
cate_chan
Group: Members
Posts: 406
Joined: 4-May 18

|
QUOTE(dragontamer8740 @ Oct 11 2021, 06:25)

you gonna hook up that floppy drive just sitting there all alone? besides that, very nice setup and case.

in other thoughts: luckily the weather is getting more favorable for hardware, soon it'll be 'jager cooling in windowsill with sub-20C thermal zones' levels of cool again.

This post has been edited by cate_chan: Oct 11 2021, 21:51
|
|
|
|
 |
|
Oct 11 2021, 21:58
|
Moonlight Rambler
Group: Gold Star Club
Posts: 6,498
Joined: 22-August 12

|
QUOTE(cate_chan @ Oct 11 2021, 15:48)  you gonna hook up that floppy drive just sitting there all alone?
I'd love to, and it was hooked in before I started shuffling everything around, but there's under a millimeter of clearance between it and one of my hard disks underneath it if I do. BTW, that floppy drive is new old stock I got for $1 earlier this year. Should I do it anyway?

QUOTE(cate_chan @ Oct 11 2021, 15:48)
besides that, very nice setup and case.

Thanks; the case was given to me for free around 2015. The setup inside it is pretty simplistic but it's plenty good enough for me. I keep thinking about spray-painting the window black. I don't really care for windows on cases, and this one's just plastic so it's got scratches and such.

QUOTE(cate_chan @ Oct 11 2021, 15:48)
in other thoughts: luckily the weather is getting more favorable for hardware, soon it'll be 'jager cooling in windowsill with sub-20C thermal zones' levels of cool again.

Ugh, don't remind me. Times like this I wish I still had my NetBurst dual-CPU space heater.

This post has been edited by dragontamer8740: Oct 11 2021, 21:59
|
|
|
|
 |
|